Jan 21 14:29:21 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 21 14:29:21 crc restorecon[4654]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: 
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 
14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 21 14:29:21 crc 
restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 
14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:21 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 21 14:29:22 crc restorecon[4654]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jan 21 14:29:22 crc kubenswrapper[4720]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.204266 4720 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207191 4720 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207211 4720 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207219 4720 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207224 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207228 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207232 4720 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207236 4720 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207239 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207244 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207248 4720 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207251 4720 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207257 4720 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207262 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207266 4720 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207270 4720 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207274 4720 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207278 4720 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207281 4720 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207284 4720 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207288 4720 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207292 4720 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207295 4720 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207299 4720 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207302 4720 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207306 4720 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207311 4720 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207315 4720 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207319 4720 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207323 4720 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207326 4720 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207330 4720 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207334 4720 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207337 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207340 4720 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207344 4720 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207348 4720 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207351 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207355 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207360 4720 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207364 4720 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207368 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207372 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207375 4720 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207379 4720 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207383 4720 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207388 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207392 4720 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207395 4720 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207399 4720 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207402 4720 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207405 4720 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207409 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207413 4720 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207416 4720 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207421 4720 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207426 4720 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207430 4720 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207434 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207438 4720 feature_gate.go:330] unrecognized feature gate: Example Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207442 4720 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207445 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207449 4720 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207452 4720 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207456 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207459 4720 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207463 4720 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207466 4720 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207469 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207473 4720 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207477 4720 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.207482 4720 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207566 4720 flags.go:64] FLAG: --address="0.0.0.0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207575 4720 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207581 4720 flags.go:64] FLAG: --anonymous-auth="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207586 4720 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207591 4720 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207596 4720 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207602 4720 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207607 4720 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207611 4720 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207615 4720 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207620 4720 flags.go:64] FLAG: 
--bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207624 4720 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207628 4720 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207632 4720 flags.go:64] FLAG: --cgroup-root="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207636 4720 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207641 4720 flags.go:64] FLAG: --client-ca-file="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207645 4720 flags.go:64] FLAG: --cloud-config="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207652 4720 flags.go:64] FLAG: --cloud-provider="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207672 4720 flags.go:64] FLAG: --cluster-dns="[]" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207679 4720 flags.go:64] FLAG: --cluster-domain="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207684 4720 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207689 4720 flags.go:64] FLAG: --config-dir="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207694 4720 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207698 4720 flags.go:64] FLAG: --container-log-max-files="5" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207704 4720 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207708 4720 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207712 4720 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207717 4720 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207721 4720 flags.go:64] FLAG: --contention-profiling="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207725 4720 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207729 4720 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207733 4720 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207737 4720 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207743 4720 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207747 4720 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207752 4720 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207756 4720 flags.go:64] FLAG: --enable-load-reader="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207761 4720 flags.go:64] FLAG: --enable-server="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207766 4720 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207771 4720 flags.go:64] FLAG: --event-burst="100" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207775 4720 flags.go:64] FLAG: --event-qps="50" 
Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207779 4720 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207783 4720 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207788 4720 flags.go:64] FLAG: --eviction-hard="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207792 4720 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207796 4720 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207800 4720 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207805 4720 flags.go:64] FLAG: --eviction-soft="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207809 4720 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207813 4720 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207816 4720 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207820 4720 flags.go:64] FLAG: --experimental-mounter-path="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207824 4720 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207828 4720 flags.go:64] FLAG: --fail-swap-on="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207832 4720 flags.go:64] FLAG: --feature-gates="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207837 4720 flags.go:64] FLAG: --file-check-frequency="20s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207841 4720 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207845 4720 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207850 4720 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207854 4720 flags.go:64] FLAG: --healthz-port="10248" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207858 4720 flags.go:64] FLAG: --help="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207862 4720 flags.go:64] FLAG: --hostname-override="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207866 4720 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207870 4720 flags.go:64] FLAG: --http-check-frequency="20s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207874 4720 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207878 4720 flags.go:64] FLAG: --image-credential-provider-config="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207881 4720 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207885 4720 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207889 4720 flags.go:64] FLAG: --image-service-endpoint="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207899 4720 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207903 4720 flags.go:64] FLAG: --kube-api-burst="100" Jan 21 14:29:22 crc 
kubenswrapper[4720]: I0121 14:29:22.207908 4720 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207912 4720 flags.go:64] FLAG: --kube-api-qps="50" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207916 4720 flags.go:64] FLAG: --kube-reserved="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207920 4720 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207924 4720 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207928 4720 flags.go:64] FLAG: --kubelet-cgroups="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207932 4720 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207936 4720 flags.go:64] FLAG: --lock-file="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207940 4720 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207944 4720 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207948 4720 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207954 4720 flags.go:64] FLAG: --log-json-split-stream="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207958 4720 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207963 4720 flags.go:64] FLAG: --log-text-split-stream="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207967 4720 flags.go:64] FLAG: --logging-format="text" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207971 4720 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207975 4720 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207979 4720 flags.go:64] FLAG: --manifest-url="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207983 4720 flags.go:64] FLAG: --manifest-url-header="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207988 4720 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207992 4720 flags.go:64] FLAG: --max-open-files="1000000" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.207997 4720 flags.go:64] FLAG: --max-pods="110" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208001 4720 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208005 4720 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208009 4720 flags.go:64] FLAG: --memory-manager-policy="None" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208013 4720 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208017 4720 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208021 4720 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208026 4720 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 
14:29:22.208037 4720 flags.go:64] FLAG: --node-status-max-images="50" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208042 4720 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208048 4720 flags.go:64] FLAG: --oom-score-adj="-999" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208053 4720 flags.go:64] FLAG: --pod-cidr="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208058 4720 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208067 4720 flags.go:64] FLAG: --pod-manifest-path="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208072 4720 flags.go:64] FLAG: --pod-max-pids="-1" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208077 4720 flags.go:64] FLAG: --pods-per-core="0" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208082 4720 flags.go:64] FLAG: --port="10250" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208087 4720 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208091 4720 flags.go:64] FLAG: --provider-id="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208095 4720 flags.go:64] FLAG: --qos-reserved="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208098 4720 flags.go:64] FLAG: --read-only-port="10255" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208103 4720 flags.go:64] FLAG: --register-node="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208107 4720 flags.go:64] FLAG: --register-schedulable="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208111 4720 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208118 4720 flags.go:64] FLAG: --registry-burst="10" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208122 4720 flags.go:64] FLAG: --registry-qps="5" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208126 4720 flags.go:64] FLAG: --reserved-cpus="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208130 4720 flags.go:64] FLAG: --reserved-memory="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208145 4720 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208150 4720 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208155 4720 flags.go:64] FLAG: --rotate-certificates="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208160 4720 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208165 4720 flags.go:64] FLAG: --runonce="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208170 4720 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208176 4720 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208180 4720 flags.go:64] FLAG: --seccomp-default="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208184 4720 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208188 4720 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 21 14:29:22 crc 
kubenswrapper[4720]: I0121 14:29:22.208192 4720 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208197 4720 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208201 4720 flags.go:64] FLAG: --storage-driver-password="root" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208205 4720 flags.go:64] FLAG: --storage-driver-secure="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208209 4720 flags.go:64] FLAG: --storage-driver-table="stats" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208214 4720 flags.go:64] FLAG: --storage-driver-user="root" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208217 4720 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208222 4720 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208227 4720 flags.go:64] FLAG: --system-cgroups="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208232 4720 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208240 4720 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208246 4720 flags.go:64] FLAG: --tls-cert-file="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208251 4720 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208261 4720 flags.go:64] FLAG: --tls-min-version="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208266 4720 flags.go:64] FLAG: --tls-private-key-file="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208271 4720 flags.go:64] FLAG: --topology-manager-policy="none" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208276 4720 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208280 4720 flags.go:64] FLAG: --topology-manager-scope="container" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208284 4720 flags.go:64] FLAG: --v="2" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208290 4720 flags.go:64] FLAG: --version="false" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208295 4720 flags.go:64] FLAG: --vmodule="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208300 4720 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208305 4720 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208413 4720 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208425 4720 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208430 4720 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208435 4720 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208440 4720 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208444 4720 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 
14:29:22.208448 4720 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208453 4720 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208457 4720 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208461 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208466 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208470 4720 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208474 4720 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208479 4720 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208483 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208488 4720 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208492 4720 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208496 4720 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208500 4720 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208505 4720 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208511 4720 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208517 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208522 4720 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208527 4720 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208532 4720 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208537 4720 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208541 4720 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208546 4720 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208550 4720 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208556 4720 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
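[editor's annotation] The flags.go:64 dump a few entries above records every effective command-line value the kubelet started with. A minimal sketch for pulling those pairs back out of a saved journal excerpt like this one; the regex and function name are illustrative, not part of any existing tool.

import re

# Sketch: recover --flag="value" pairs from the flags.go:64 dump above.
# The journal prefix ("Jan 21 ... kubenswrapper[4720]: ") is simply skipped
# by anchoring on the flags.go source location.
FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w.-]+)="([^"]*)"')

def parse_flags(journal_text: str) -> dict:
    return {name: value for name, value in FLAG_RE.findall(journal_text)}

# Usage: parse_flags(log_text)["--node-ip"] -> "192.168.126.11"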
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208562 4720 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208566 4720 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208571 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208579 4720 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208584 4720 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208587 4720 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208591 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208595 4720 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208598 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208601 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208605 4720 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208608 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208612 4720 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208618 4720 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208624 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208630 4720 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208635 4720 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208640 4720 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208645 4720 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208673 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208679 4720 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208684 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208689 4720 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208694 4720 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208699 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208705 4720 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208712 4720 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208717 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208722 4720 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208728 4720 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208734 4720 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208739 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208743 4720 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208748 4720 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208752 4720 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208759 4720 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208763 4720 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208768 4720 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208772 4720 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208776 4720 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.208781 4720 feature_gate.go:330] unrecognized feature gate: Example Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.208960 4720 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.217468 4720 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.217503 4720 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217675 4720 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217693 4720 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217706 4720 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217713 4720 feature_gate.go:330] unrecognized feature gate: Example Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217721 4720 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217729 4720 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217737 4720 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217744 4720 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217753 4720 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217760 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217767 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217773 4720 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217781 4720 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217787 4720 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217794 4720 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217801 4720 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217807 4720 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217814 4720 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217821 4720 
feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217827 4720 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217834 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217841 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217848 4720 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217858 4720 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217866 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217874 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217881 4720 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217888 4720 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217895 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217902 4720 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217909 4720 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217917 4720 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217924 4720 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217931 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217939 4720 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217947 4720 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217953 4720 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217960 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217967 4720 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217976 4720 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217982 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217991 4720 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.217999 4720 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218006 4720 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218013 4720 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218020 4720 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218026 4720 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218033 4720 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218039 4720 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218046 4720 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218053 4720 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218059 4720 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218066 4720 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218073 4720 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218080 4720 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218086 4720 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218093 4720 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218100 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218106 4720 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218113 4720 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218120 4720 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218126 4720 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218133 4720 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218141 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218148 4720 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218154 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218164 4720 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
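[editor's annotation] Each pass over the configured gate set re-emits one warning per OpenShift-specific gate name the upstream kubelet does not recognize, which is why the same names recur in blocks throughout this excerpt. A small sketch, assuming the log format shown above, to separate the count of distinct gate names from the number of parse passes:

import re
from collections import Counter

# Sketch: tally the "unrecognized feature gate" warnings above.
GATE_RE = re.compile(r"unrecognized feature gate: (\w+)")

def unrecognized_gates(journal_text: str) -> Counter:
    return Counter(GATE_RE.findall(journal_text))

# counts = unrecognized_gates(log_text)
# len(counts)          -> number of distinct unrecognized gate names
# max(counts.values()) -> number of passes over the gate set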
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218172 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218179 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218187 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218195 4720 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.218207 4720 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218442 4720 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218456 4720 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218465 4720 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218473 4720 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218481 4720 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218488 4720 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218495 4720 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218505 4720 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218514 4720 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218522 4720 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218530 4720 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218536 4720 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218544 4720 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218550 4720 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218557 4720 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218564 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218570 4720 feature_gate.go:330] unrecognized feature gate: Example Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218577 4720 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218584 4720 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218591 4720 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218598 4720 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218604 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218611 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218618 4720 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218625 4720 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218632 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218639 4720 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218646 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218652 4720 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218680 4720 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218688 4720 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218694 4720 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218700 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 
14:29:22.218707 4720 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218716 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218723 4720 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218730 4720 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218737 4720 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218744 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218751 4720 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218757 4720 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218764 4720 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218770 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218777 4720 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218784 4720 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218792 4720 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218800 4720 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218807 4720 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218813 4720 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218820 4720 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218826 4720 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218833 4720 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218842 4720 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218851 4720 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218859 4720 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218868 4720 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218877 4720 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218883 4720 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218890 4720 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218898 4720 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218904 4720 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218911 4720 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218918 4720 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218925 4720 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218931 4720 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218938 4720 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218944 4720 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218951 4720 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218958 4720 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218964 4720 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.218973 4720 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.218984 4720 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.219524 4720 server.go:940] "Client rotation is on, will bootstrap in background" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.225411 4720 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.225616 4720 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
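[editor's annotation] The feature_gate.go:386 summary entries above print the effective gate map in Go's fmt map notation. A minimal sketch to turn one of those lines into a Python dict; the names are illustrative.

import re

# Sketch: parse a "feature gates: {map[Name:bool ...]}" summary line.
MAP_RE = re.compile(r"feature gates: \{map\[([^\]]*)\]\}")

def parse_feature_gates(line: str) -> dict:
    m = MAP_RE.search(line)
    if not m:
        return {}
    return {name: value == "true"
            for name, value in (item.split(":", 1)
                                for item in m.group(1).split())}

# parse_feature_gates(summary_line) -> {"CloudDualStackNodeIPs": True, ...}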
Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.226922 4720 server.go:997] "Starting client certificate rotation" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.226950 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.227111 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-26 04:13:21.208792087 +0000 UTC Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.227251 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.232804 4720 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.234096 4720 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.235332 4720 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.244907 4720 log.go:25] "Validated CRI v1 runtime API" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.259459 4720 log.go:25] "Validated CRI v1 image API" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.260731 4720 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.262567 4720 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-21-14-23-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.262597 4720 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.272966 4720 manager.go:217] Machine: {Timestamp:2026-01-21 14:29:22.271768908 +0000 UTC m=+0.180508850 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199484928 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:35e04af4-02ae-4e02-a92c-e5d6146fb726 BootID:4d815218-ae38-4c10-ad0e-53cee38645ef Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076109 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599742464 Type:vfs Inodes:3076109 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c3:c2:74 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c3:c2:74 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:78:24:f1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ee:ce:9e Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:13:44:47 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:25:c8:a0 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:ee:a8:b3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:d0:c3:dd:1b:b4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:17:c0:af:c7:bb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199484928 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.273149 4720 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.273331 4720 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274008 4720 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274191 4720 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274225 4720 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274396 4720 topology_manager.go:138] "Creating topology manager with none policy" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274405 4720 container_manager_linux.go:303] "Creating device plugin manager" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274604 4720 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274646 4720 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274841 4720 state_mem.go:36] "Initialized new in-memory state store" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.274921 4720 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.275953 4720 kubelet.go:418] "Attempting to sync node with API server" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.275978 4720 
kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.276012 4720 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.276025 4720 kubelet.go:324] "Adding apiserver pod source" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.276035 4720 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.278277 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.278390 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.278280 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.278541 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.279230 4720 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.279681 4720 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
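The reflector failures above share one cause: the kubelet starts before the static-pod API server, so every list/watch against api-int.crc.testing:6443 is refused until the control plane is listening, and client-go simply retries. A small probe for the same condition (host and port taken from the log; the fixed one-second retry is a simplification, client-go backs off exponentially):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "api-int.crc.testing:6443" // endpoint from the log
	for i := 0; i < 5; i++ {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err != nil {
			fmt.Printf("attempt %d: %v\n", i+1, err)
			time.Sleep(time.Second) // crude fixed backoff
			continue
		}
		conn.Close()
		fmt.Println("API server endpoint is accepting connections")
		return
	}
	fmt.Println("endpoint still unreachable; control plane likely not up yet")
}

Until this dial succeeds, the "connection refused" reflector errors in the log keep repeating; they stop on their own once the API server static pod is serving.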
Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.280749 4720 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281347 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281434 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281491 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281540 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281643 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281718 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281779 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281834 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281889 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281941 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.281992 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.282044 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.282427 4720 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.282926 4720 server.go:1280] "Started kubelet" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.283260 4720 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.283329 4720 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.283343 4720 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.284101 4720 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 21 14:29:22 crc systemd[1]: Started Kubernetes Kubelet. 
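The ratelimit.go:55 line reports a token-bucket limit on the podresources endpoint: 100 requests per second with a burst of 10. The sketch below reproduces a bucket with those parameters using golang.org/x/time/rate to show what the two numbers mean; it illustrates the parameters only, not the kubelet's actual serving path:

package main

import (
	"context"
	"fmt"
	"time"

	"golang.org/x/time/rate" // external module: golang.org/x/time
)

func main() {
	// 100 tokens/s refill, bucket depth (burst) of 10.
	lim := rate.NewLimiter(rate.Limit(100), 10)

	served := 0
	for lim.Allow() { // the initial burst is served without waiting
		served++
	}
	fmt.Println("burst served immediately:", served) // expect 10

	// Subsequent requests are paced at ~100/s (one token every 10ms).
	ctx, cancel := context.WithTimeout(context.Background(), 500*time.Millisecond)
	defer cancel()
	if err := lim.Wait(ctx); err == nil {
		served++
	}
	fmt.Println("total served:", served)
}

In other words, qps=100 bounds the steady-state request rate on the unix:/var/lib/kubelet/pod-resources/kubelet.sock endpoint, while burstTokens=10 bounds how many requests may arrive back-to-back before pacing kicks in.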
Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.285157 4720 server.go:460] "Adding debug handlers to kubelet server" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.286644 4720 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188cc55e427dc851 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 14:29:22.282899537 +0000 UTC m=+0.191639469,LastTimestamp:2026-01-21 14:29:22.282899537 +0000 UTC m=+0.191639469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.288383 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.288493 4720 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.288899 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:20:00.71040662 +0000 UTC Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.290110 4720 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.290138 4720 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.290257 4720 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.290816 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.290938 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.291178 4720 factory.go:55] Registering systemd factory Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.291248 4720 factory.go:221] Registration of the systemd container factory successfully Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.291551 4720 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.291719 4720 factory.go:153] Registering CRI-O factory Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.291790 4720 factory.go:221] Registration of the crio container factory successfully Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.291720 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.291983 4720 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.292070 4720 factory.go:103] Registering Raw factory Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.292152 4720 manager.go:1196] Started watching for new ooms in manager Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.292672 4720 manager.go:319] Starting recovery of all containers Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366272 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366520 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366539 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366551 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366560 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366570 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366579 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366589 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366600 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366609 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366619 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366630 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366640 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366651 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366679 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366689 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366698 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366709 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366720 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366729 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366738 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366748 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366757 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366768 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366777 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366787 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366801 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366811 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366822 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366832 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.366847 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.376973 4720 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377033 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377054 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377070 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377087 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377104 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377118 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377133 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377147 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377162 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377176 4720 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377192 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377205 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377220 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377233 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377247 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377260 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377275 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377289 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377303 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377321 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377337 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377358 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377386 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377400 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377416 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377431 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377444 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377456 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377471 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377485 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377498 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377511 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377523 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377537 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377550 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377565 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377588 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377604 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377619 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377686 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377700 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377715 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377744 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377757 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377771 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377785 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377797 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377811 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377824 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377836 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377849 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377865 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377878 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377890 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377903 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377917 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377930 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377942 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377955 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377968 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377982 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.377995 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378010 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378023 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378037 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378048 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378096 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378110 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378122 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378135 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378149 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378161 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378177 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378197 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378213 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378228 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378242 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378257 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378270 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378285 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378300 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378313 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378326 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378340 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378368 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378383 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378396 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378408 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378421 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378434 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378447 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378459 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378470 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378482 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378493 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378506 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378516 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378527 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378538 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378551 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378561 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378574 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378585 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378597 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378612 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378624 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378635 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378646 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378702 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378716 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378727 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378740 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378753 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378764 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378776 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378790 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378803 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378816 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378830 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378843 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378855 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378868 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378881 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378893 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378905 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378918 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378932 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378943 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378957 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378971 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378983 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.378997 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379009 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379020 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379032 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379043 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379056 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379067 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379080 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379092 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379104 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379115 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379126 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379137 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379148 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379161 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379174 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379188 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379199 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379211 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379222 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379234 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379246 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379258 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379270 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379284 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379296 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379309 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379322 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379335 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379350 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379363 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379375 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379389 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379401 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379415 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379427 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379441 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379453 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379466 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379478 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379490 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379503 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379517 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379530 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379543 4720 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379554 4720 reconstruct.go:97] "Volume reconstruction finished" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379562 4720 reconciler.go:26] "Reconciler: start to sync state" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.379686 4720 manager.go:324] Recovery completed Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.388073 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.389835 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.389870 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.389882 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.392996 4720 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.393148 4720 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.393243 4720 state_mem.go:36] "Initialized new in-memory state store" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.394023 4720 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.493031 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.494201 4720 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.595467 4720 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.670973 4720 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.676424 4720 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.676818 4720 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.676987 4720 kubelet.go:2335] "Starting kubelet main sync loop" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.677068 4720 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 21 14:29:22 crc kubenswrapper[4720]: W0121 14:29:22.678514 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.678583 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.696423 4720 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.712643 4720 policy_none.go:49] "None policy: Start" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.713589 4720 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.713615 4720 state_mem.go:35] "Initializing new in-memory state store" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.760637 4720 manager.go:334] "Starting Device Plugin manager" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.760704 4720 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.760715 4720 server.go:79] "Starting device plugin registration server" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.761106 4720 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.761128 4720 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.762051 4720 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.762127 4720 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.762142 4720 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.769305 4720 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.777552 4720 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Jan 21 14:29:22 crc kubenswrapper[4720]: 
I0121 14:29:22.777719 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.778892 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.778918 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.778929 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.779055 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.779565 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.779601 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780180 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780200 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780249 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780733 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780756 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780766 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.780863 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.781227 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.781262 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.781869 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.781888 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.781898 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782347 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782373 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782386 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782461 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782574 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.782612 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783317 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783339 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783348 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783433 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783513 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783528 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783548 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783798 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.783816 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784350 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784384 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784395 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784415 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784434 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784444 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.784953 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.785634 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.787473 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.787500 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.787510 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.861433 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.863432 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.863487 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.863500 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.863531 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.864102 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884431 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884504 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884526 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884547 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884622 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884695 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884734 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884815 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884846 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884877 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884903 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884934 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884960 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.884985 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.885007 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: E0121 14:29:22.893902 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985784 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985846 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985914 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985944 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985969 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.985994 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986048 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986076 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986105 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986130 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986153 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986132 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986218 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986190 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986249 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986180 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986303 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986325 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986348 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986280 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 
14:29:22.986337 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986096 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986413 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986422 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986471 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986431 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 21 14:29:22 crc kubenswrapper[4720]: I0121 14:29:22.986162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.064698 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.065887 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.065917 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.065927 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.065947 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.066412 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc" Jan 21 
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.133591 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.133727 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.134789 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.152612 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4e549b32b90d6fea49962f10ae2c214983eb973a60f8e8fd2d57382535ce2233 WatchSource:0}: Error finding container 4e549b32b90d6fea49962f10ae2c214983eb973a60f8e8fd2d57382535ce2233: Status 404 returned error can't find the container with id 4e549b32b90d6fea49962f10ae2c214983eb973a60f8e8fd2d57382535ce2233
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.153072 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.160279 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.165169 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9e7918b04f23e23124745b3ff751d4d4dce1c5a5364799ff3889a74e71acda13 WatchSource:0}: Error finding container 9e7918b04f23e23124745b3ff751d4d4dce1c5a5364799ff3889a74e71acda13: Status 404 returned error can't find the container with id 9e7918b04f23e23124745b3ff751d4d4dce1c5a5364799ff3889a74e71acda13
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.165325 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.179176 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-513525aadd9bbf1ab5e9dfb5fa1c2a9873747fcfe84276d5083a786773989cae WatchSource:0}: Error finding container 513525aadd9bbf1ab5e9dfb5fa1c2a9873747fcfe84276d5083a786773989cae: Status 404 returned error can't find the container with id 513525aadd9bbf1ab5e9dfb5fa1c2a9873747fcfe84276d5083a786773989cae
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.182759 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-edeb88d5bcf64ead903e606855a5e472d8e61e1f595e234ec8963a91d59b1344 WatchSource:0}: Error finding container edeb88d5bcf64ead903e606855a5e472d8e61e1f595e234ec8963a91d59b1344: Status 404 returned error can't find the container with id edeb88d5bcf64ead903e606855a5e472d8e61e1f595e234ec8963a91d59b1344
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.284182 4720 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.289301 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 17:11:20.024458481 +0000 UTC
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.466631 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.469397 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.469434 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.469443 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.469469 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.469929 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.677856 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.677926 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.680537 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.680601 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.685274 4720 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2ff2c303f39d89fe4e0e43df3ed8412db124ae2441b88798ce8aef4381965de7" exitCode=0
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.685366 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2ff2c303f39d89fe4e0e43df3ed8412db124ae2441b88798ce8aef4381965de7"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.685452 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"513525aadd9bbf1ab5e9dfb5fa1c2a9873747fcfe84276d5083a786773989cae"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.685559 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.686918 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.686943 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.686952 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.687395 4720 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4d820fe169517c34ffd70519cdac2dc46d0ce4eecc8fe6b3731463492bb64e45" exitCode=0
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.687458 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4d820fe169517c34ffd70519cdac2dc46d0ce4eecc8fe6b3731463492bb64e45"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.687477 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9e7918b04f23e23124745b3ff751d4d4dce1c5a5364799ff3889a74e71acda13"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.687537 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.689139 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.689155 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.689163 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.690837 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.690873 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e549b32b90d6fea49962f10ae2c214983eb973a60f8e8fd2d57382535ce2233"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.692178 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b" exitCode=0
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.692255 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.692280 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad651d62c40c0f48e28e233d7b370356f85f7e8407b3b92cc7589a2ef60537bb"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.692373 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693049 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693077 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693088 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693924 4720 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="77060f77a4769e8fdd43e676cb012de7406e40c487a4b9cdf48ce898e7429897" exitCode=0
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693960 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"77060f77a4769e8fdd43e676cb012de7406e40c487a4b9cdf48ce898e7429897"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.693981 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"edeb88d5bcf64ead903e606855a5e472d8e61e1f595e234ec8963a91d59b1344"}
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.694042 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.694472 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.694784 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.694815 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.694826 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.697124 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.699128 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.699155 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:23 crc kubenswrapper[4720]: I0121 14:29:23.699164 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:23 crc kubenswrapper[4720]: W0121 14:29:23.858786 4720 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:23 crc kubenswrapper[4720]: E0121 14:29:23.858858 4720 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.270372 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.272727 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.272777 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.272788 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.272812 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 14:29:24 crc kubenswrapper[4720]: E0121 14:29:24.273229 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.103:6443: connect: connection refused" node="crc"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.284925 4720 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.103:6443: connect: connection refused
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.290265 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:02:30.971444 +0000 UTC
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.358952 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 21 14:29:24 crc kubenswrapper[4720]: E0121 14:29:24.360234 4720 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.103:6443: connect: connection refused" logger="UnhandledError"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.698367 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.698421 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.698425 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.698530 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.699216 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.699245 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.699254 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.700788 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.700823 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.700833 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.700841 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.702701 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b29dd6c22bb5b3004238c10e3d17989b45ec75bbb771ab2565a8624a240e59f0"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.702777 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.703567 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.703603 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.703624 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.705105 4720 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a83b37dd39eded707ce491f35f6f58ae2b269305d39a37b58df89741dcbd9dc0" exitCode=0
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.705164 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a83b37dd39eded707ce491f35f6f58ae2b269305d39a37b58df89741dcbd9dc0"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.705295 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.706030 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.706054 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.706062 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.707204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"897598273bdf66f89a506dc81686a05fcb6b1bc1b02f470094adc4fbc1058262"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.707236 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7d438afa16f1838ce25db6be8e67b6df52d63ebe4580f22209f49b5c24e7f795"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.707250 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cbd5137376c8f776e2ba672fd33d80611152b89c40b3d98a2b6f9d540f562a83"}
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.707340 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.708279 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.708302 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.708310 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:24 crc kubenswrapper[4720]: I0121 14:29:24.769418 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.290999 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 09:11:07.212698703 +0000 UTC
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.711616 4720 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dc51408031a82f0393a5899c6aeccea619de355df4aaea738a9e2cea42f97f9c" exitCode=0
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.711678 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dc51408031a82f0393a5899c6aeccea619de355df4aaea738a9e2cea42f97f9c"}
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.711862 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.712685 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.712712 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.712721 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.714875 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.714917 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.714945 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.714876 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754"}
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715636 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715665 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715673 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715700 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715717 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.715728 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.716246 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.716272 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.716281 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.873296 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.874304 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.874351 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.874388 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:25 crc kubenswrapper[4720]: I0121 14:29:25.874436 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.291814 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 04:38:53.802636535 +0000 UTC
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721833 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9fc54131aaaa6ba0ecbb9ceba17e333aa096b7393d6b14d871cc7a8516b9ef2e"}
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721883 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"09a7199dbfd55fab4729957a31e0c5480242055435d9998d8c175d32f2d11a50"}
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721899 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bc6ac79aafa58e2df9a3dbc4eacc6eee2e46f700792c4821d1fdeabf716f8d4c"}
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721901 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721958 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.722021 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.721912 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9a9a62c0b95a2386bccaea9b78f10c7c4358adab9c1e80f78d74d41a72df2e1c"}
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.722137 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"baa2e1453f4dfc431a240b0d4f1864dd290358c1787e094c5b812de30e6fdf66"}
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723116 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723142 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723150 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723199 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723230 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.723246 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:26 crc kubenswrapper[4720]: I0121 14:29:26.869730 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.292755 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 05:46:16.982417127 +0000 UTC
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.724308 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.725164 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.725204 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.725221 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.887002 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.887173 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.888345 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.888445 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:27 crc kubenswrapper[4720]: I0121 14:29:27.888541 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.293743 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 00:54:30.502018823 +0000 UTC
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.651730 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.726443 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.727400 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.727444 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:28 crc kubenswrapper[4720]: I0121 14:29:28.727455 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.294388 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 05:11:54.391181663 +0000 UTC
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.441389 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.441584 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.441628 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.442818 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.442846 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:29 crc kubenswrapper[4720]: I0121 14:29:29.442857 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.295248 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 04:10:56.137601599 +0000 UTC
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.631458 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.631636 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.631686 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.633187 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.633222 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.633234 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.665550 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.665722 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.667053 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.667124 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:30 crc kubenswrapper[4720]: I0121 14:29:30.667146 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.295472 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 21:32:55.58593113 +0000 UTC
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.734951 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.735086 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.736143 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.736181 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:31 crc kubenswrapper[4720]: I0121 14:29:31.736190 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.296127 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:09:11.399259942 +0000 UTC
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.318472 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.318691 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.319863 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.319898 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.319907 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.745194 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.745487 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.747412 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.747679 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.747693 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.750595 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:32 crc kubenswrapper[4720]: E0121 14:29:32.769471 4720 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.977125 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.977304 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.978327 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.978356 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:32 crc kubenswrapper[4720]: I0121 14:29:32.978365 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.296356 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:32:27.355954095 +0000 UTC Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.665995 4720 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.666069 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.741739 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.743010 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.743049 4720 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.743063 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:33 crc kubenswrapper[4720]: I0121 14:29:33.748554 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.297187 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 22:57:01.622311956 +0000 UTC Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.743487 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.744796 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.744947 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.745023 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.796931 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.797031 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.810581 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 21 14:29:34 crc kubenswrapper[4720]: I0121 14:29:34.810678 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 21 14:29:35 crc kubenswrapper[4720]: I0121 14:29:35.298384 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 12:42:23.16352597 +0000 UTC Jan 21 14:29:36 crc kubenswrapper[4720]: I0121 14:29:36.298598 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:41:32.108664134 +0000 UTC Jan 21 14:29:36 crc kubenswrapper[4720]: I0121 14:29:36.657376 4720 patch_prober.go:28] interesting 
pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 21 14:29:36 crc kubenswrapper[4720]: I0121 14:29:36.657470 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 21 14:29:37 crc kubenswrapper[4720]: I0121 14:29:37.299743 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 18:40:42.073794524 +0000 UTC
Jan 21 14:29:38 crc kubenswrapper[4720]: I0121 14:29:38.300702 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:24:44.728625392 +0000 UTC
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.301638 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 09:23:43.968631344 +0000 UTC
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.445956 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.446140 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.446754 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.446884 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.447137 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.447230 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.447310 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.450209 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.755431 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.756189 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.756291 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.756303 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.756325 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.756337 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 21 14:29:39 crc kubenswrapper[4720]: E0121 14:29:39.801865 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.805453 4720 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.806485 4720 trace.go:236] Trace[1265285902]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:26.259) (total time: 13546ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1265285902]: ---"Objects listed" error: 13546ms (14:29:39.806)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1265285902]: [13.546843953s] [13.546843953s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.806503 4720 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.806994 4720 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.807585 4720 trace.go:236] Trace[1620683101]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:25.838) (total time: 13969ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1620683101]: ---"Objects listed" error: 13969ms (14:29:39.807)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1620683101]: [13.969146287s] [13.969146287s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.807604 4720 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808089 4720 trace.go:236] Trace[1077416000]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:25.148) (total time: 14659ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1077416000]: ---"Objects listed" error: 14659ms (14:29:39.807)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[1077416000]: [14.659375726s] [14.659375726s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808110 4720 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808397 4720 trace.go:236] Trace[798014851]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (21-Jan-2026 14:29:27.041) (total time: 12766ms):
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[798014851]: ---"Objects listed" error: 12766ms (14:29:39.808)
Jan 21 14:29:39 crc kubenswrapper[4720]: Trace[798014851]: [12.766663278s] [12.766663278s] END
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.808417 4720 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 21 14:29:39 crc kubenswrapper[4720]: E0121 14:29:39.809938 4720 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.842336 4720 csr.go:261] certificate signing request csr-x7cfs is approved, waiting to be issued
Jan 21 14:29:39 crc kubenswrapper[4720]: I0121 14:29:39.859140 4720 csr.go:257] certificate signing request csr-x7cfs is issued
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.286092 4720 apiserver.go:52] "Watching apiserver"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290114 4720 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290338 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290655 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290837 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.290959 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291005 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.291069 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.291716 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291760 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.291792 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.292458 4720 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.293772 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294151 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294182 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294483 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.294574 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.295359 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297211 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297342 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.297711 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.301927 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:11:52.193791656 +0000 UTC
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309223 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309270 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309318 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309345 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309363 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309381 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309400 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309445 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309464 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309483 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309503 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309613 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309633 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309657 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309733 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309755 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309774 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309812 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309834 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309854 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309874 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309894 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309918 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309967 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309980 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.309987 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310077 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310101 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310123 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310144 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310166 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310190 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310211 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310232 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310255 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310278 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310364 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310385 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310405 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310433 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310458 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310509 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310526 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310542 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310558 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310572 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310617 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310633 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310646 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310663 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310848 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310865 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310880 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310897 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310915 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310929 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310943 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310959 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310972 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310989 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311004 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311027 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311048 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311063 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311079 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311095 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311109 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311124 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311153 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311181 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311196 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311230 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311244 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311259 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311275 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311289 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311305 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311335 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311350 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311426 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311448 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311465 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311480 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311515 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311531 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311547 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311561 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311577 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311592 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311607 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311621 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311638 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311658 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311724 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311739 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311754 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311769 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311784 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311817 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311832 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311847 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311878 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311894 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311910 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311944 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311958 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311974 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312024 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312042 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312058 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312074 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312089 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312105 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312169 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312184 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312200 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312218 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312233 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312248 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312263 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312279 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312294 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312312 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312343 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312361 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312382 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312406 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312439 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312455 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312471 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312486 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312501 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312532 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312549 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312564 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312579 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312595 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312626 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312642 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312665 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312692 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312709 4720 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312743 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312775 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312791 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312808 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312823 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312839 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312856 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 
14:29:40.312872 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312888 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312938 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312954 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312970 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312987 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313003 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313020 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313037 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313054 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313071 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313088 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313105 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313121 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313137 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313155 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313171 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313209 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313236 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313254 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313271 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313292 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313312 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313347 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313395 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313415 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313433 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313453 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313469 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313514 4720 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318246 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.331821 4720 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310187 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310403 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310569 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310790 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.310926 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311063 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311223 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.311709 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312060 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312211 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.312957 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313170 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313354 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313453 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313528 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.313894 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314106 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314231 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345843 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314702 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314714 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314763 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314842 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314904 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315024 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315124 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315200 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315504 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315621 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.315955 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316010 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316208 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316229 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346704 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316365 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316481 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.316761 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.317663 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318144 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318302 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318372 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318470 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318545 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318632 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318828 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.318857 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.321771 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322159 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322166 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322342 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322350 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.322374 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323209 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323374 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.323480 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324364 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324788 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.324998 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325176 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325483 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325734 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325783 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.325986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326224 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326338 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326424 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326461 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326504 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326617 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326729 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.326802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327160 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327379 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327499 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327556 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327596 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327857 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.327870 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328154 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328253 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328523 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328575 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328861 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328898 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.328936 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.329581 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.329891 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330341 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.330535 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.338745 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.340123 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.340466 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341214 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341409 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341405 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341805 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.342469 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345140 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345154 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345344 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.345542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.314436 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346030 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346403 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346479 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346543 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.346784 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347167 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347342 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347498 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347519 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347729 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347804 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.347954 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348072 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348388 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348689 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348834 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.348849 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350553 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350734 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350813 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350913 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.350876 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.351376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.851222441 +0000 UTC m=+18.759962373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351423 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.351918 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352134 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352241 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.352475 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.353582 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354397 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354487 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354760 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.354826 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341957 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355382 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355459 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.355690 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.356092 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358579 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359609 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359731 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.358001 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.363083 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.359488 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.364827 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.365093 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.365929 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366146 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366460 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.366723 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.341786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.364640 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371449 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372035 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372222 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.372750 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373205 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373516 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.374018 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.374339 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375070 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375392 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.375478 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.371904 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377359 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377501 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.373791 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.377758 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.377930 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.378011 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.378080 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.379789 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.380625 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.377294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.380697 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.877574565 +0000 UTC m=+18.786314587 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.380734 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.880717527 +0000 UTC m=+18.789457459 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.380749 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.382361 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382416 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382430 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382442 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382519 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.882497789 +0000 UTC m=+18.791237801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.382553 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:40.882545801 +0000 UTC m=+18.791285823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.384814 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.386121 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.386433 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.387021 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.392775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.394201 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.403850 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414801 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414873 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414929 4720 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414942 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414952 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414961 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414968 4720 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414977 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414985 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.414994 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415003 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415011 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415020 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415027 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415036 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415046 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415057 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415067 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415076 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415084 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415092 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415100 4720 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415107 4720 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415115 4720 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415123 4720 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415132 4720 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415140 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415164 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415172 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415182 4720 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415191 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415200 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415208 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415234 4720 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415242 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415249 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415257 4720 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415266 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415274 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415282 4720 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415292 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415301 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415309 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415317 4720 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415325 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415335 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415344 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415366 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415374 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415382 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415390 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415399 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415407 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415415 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415423 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415433 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415449 4720 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415467 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415480 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415509 4720 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415519 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415529 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415541 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415549 4720 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415557 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415549 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415566 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415624 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415644 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415664 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415694 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415708 4720 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415720 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415743 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415755 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415768 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415780 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415794 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415806 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415821 4720 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415836 4720 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415848 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415859 4720 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415871 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415884 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415896 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415910 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415922 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415934 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415945 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415957 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415968 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415980 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.415992 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416004 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416016 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416027 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416038 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416050 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416062 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416074 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416086 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416099 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416112 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416124 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416136 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416149 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416159 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416169 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416180 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416192 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416204 4720 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416221 4720 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416234 4720 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416248 4720 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416258 4720 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416268 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416278 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416288 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416298 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416309 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416319 4720 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416616 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416629 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416643 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416666 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418432 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418531 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418590 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.418640 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420284 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420307 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420320 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420333 4720 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420345 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420359 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420370 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420385 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420398 4720 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420409 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420422 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420433 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420447 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420460 4720 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420472 4720 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420484 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420495 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420506 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420517 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420544 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420555 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420567 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420579 4720 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420590 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420601 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420612 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420624 4720 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420635 4720 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420646 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420682 4720 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420696 4720 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420707 4720 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420720 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420731 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420742 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420754 4720 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420764 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420775 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420787 4720 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420798 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420810 4720 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420821 4720 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420834 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420845 4720 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420857 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420869 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420881 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420894 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420906 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420917 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420928 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420940 4720 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420956 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420968 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420980 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.420991 4720 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421002 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421015 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421026 4720 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421037 4720 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421048 4720 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.421059 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.417044 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.416325 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.423817 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.426046 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.428300 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.428653 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.438613 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.448241 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.457856 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.469287 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.480617 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.489581 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.501334 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521627 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521673 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521684 4720 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.521696 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.606329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.618608 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3 WatchSource:0}: Error finding container dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3: Status 404 returned error can't find the container with id dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.628374 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.640001 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace WatchSource:0}: Error finding container 658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace: Status 404 returned error can't find the container with id 658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.675414 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.675451 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.682381 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.683267 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.684723 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.686123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.686722 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.687756 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.688244 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.688395 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: W0121 14:29:40.688862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6 WatchSource:0}: Error finding container ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6: Status 404 returned error can't find the container with id ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.689051 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.690067 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.690547 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.695309 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.700588 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.701088 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.702329 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: 
I0121 14:29:40.702633 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.702828 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.703729 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.704253 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.704636 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.706429 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.707102 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.707613 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.713000 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.713462 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.714593 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.714988 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.716208 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.717125 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.718150 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.718207 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.719439 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.720545 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.721765 4720 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.721883 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.723499 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.724091 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.724891 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.726688 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 21 
14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.727315 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.727736 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.728163 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.728747 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.729864 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.730289 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 21 14:29:40 crc 
kubenswrapper[4720]: I0121 14:29:40.731970 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.732537 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.733851 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.734430 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.735422 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.736307 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.737509 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.737976 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.738799 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.739384 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.740462 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741013 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741422 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.741788 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.742611 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.742645 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.751356 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.757680 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ff7fa90457235efe70031af89f119c7b173bc034b46bd828c0f7216ac3ad5cc6"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.758616 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"658a7cc518b890647895600c4fba948c34c0f90c1d63341097a52d5f005f4ace"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.759267 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"dcc98c6d0530f6616d1fee7d801e4b7de085f11577f38f7289bbbd7c5302fae3"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.760879 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.762896 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754" exitCode=255 Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.762934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754"} Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.764910 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.777594 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.788505 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.790175 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.791363 4720 scope.go:117] "RemoveContainer" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.801581 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.813799 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.827582 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.839633 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.850112 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.860105 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-21 14:24:39 +0000 UTC, rotation deadline is 2026-11-11 09:26:44.839555046 +0000 UTC Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.860153 4720 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7050h57m3.979404603s for next certificate rotation Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.867687 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.878262 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.891825 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.906165 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.916088 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.924982 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925050 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925095 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.925114 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925180 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925153862 +0000 UTC m=+19.833893794 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925234 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925283 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925321 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925269475 +0000 UTC m=+19.834009407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925329 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925361 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925406 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925393969 +0000 UTC m=+19.834133971 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925451 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925498 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925514 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925461 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925577 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925556334 +0000 UTC m=+19.834296336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:40 crc kubenswrapper[4720]: E0121 14:29:40.925598 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:41.925588814 +0000 UTC m=+19.834328746 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.926826 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:40 crc kubenswrapper[4720]: I0121 14:29:40.938358 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21
T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.302456 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 08:55:04.753390073 +0000 UTC Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.367891 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-k4qfb"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.368194 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.370031 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.373623 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.375212 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.419610 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.429341 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.429378 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc 
kubenswrapper[4720]: I0121 14:29:41.454297 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.481165 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.509161 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530004 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.530141 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d24af441-df03-462d-914a-165777766cf4-hosts-file\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.547467 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.566963 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq9z7\" (UniqueName: \"kubernetes.io/projected/d24af441-df03-462d-914a-165777766cf4-kube-api-access-vq9z7\") pod \"node-resolver-k4qfb\" (UID: \"d24af441-df03-462d-914a-165777766cf4\") " pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.577175 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.600135 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.618666 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.648285 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.677291 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.677302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.677520 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.677405 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.690366 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4qfb" Jan 21 14:29:41 crc kubenswrapper[4720]: W0121 14:29:41.702469 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24af441_df03_462d_914a_165777766cf4.slice/crio-36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be WatchSource:0}: Error finding container 36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be: Status 404 returned error can't find the container with id 36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.774908 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4qfb" event={"ID":"d24af441-df03-462d-914a-165777766cf4","Type":"ContainerStarted","Data":"36032456655a7ede9461e8a27523b83f4363e842c4a0eb061ed1e93b930a06be"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.776895 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.782612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.782960 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.784357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.786604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.786638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d"} Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.817956 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.818918 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2pbsk"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.819244 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.820608 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5r9wf"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.821097 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.826354 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.826532 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.827557 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-w85dm"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.827871 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.833985 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.834701 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.835061 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849512 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849787 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849836 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.849902 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.850141 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.850343 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.855751 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.856771 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.860960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.864049 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866457 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866745 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866866 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.866944 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867062 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.867425 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.897481 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.915249 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.931492 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933568 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933680 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933709 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933730 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933751 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933770 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933790 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933810 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933827 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933874 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933896 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933916 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933934 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933953 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933971 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.933991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934011 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934029 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934242 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934271 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934287 
4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934302 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934330 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934379 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934408 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934440 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934470 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934490 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934508 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934553 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934572 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934610 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934654 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934690 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934713 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934732 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934746 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.934760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934848 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.934834233 +0000 UTC m=+21.843574165 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934892 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.934919 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.934913955 +0000 UTC m=+21.843653887 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935111 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935124 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935133 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935185 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.935177553 +0000 UTC m=+21.843917485 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935278 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935293 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935299 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935317 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.935312156 +0000 UTC m=+21.844052098 (durationBeforeRetry 2s). 
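Each kube-api-access-* volume is a projected volume assembled from several sources (a serviceaccount token plus the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps); "not registered" indicates the kubelet's object cache is not yet tracking those ConfigMaps for any admitted pod so soon after the restart, and a single missing source fails the whole SetUp. A sketch of that all-or-nothing assembly; the types are illustrative, with only the error text mirroring the projected.go lines above.

```go
// projected_sketch.go — illustrative, not kubelet code.
package main

import (
	"errors"
	"fmt"
)

type source struct {
	path  string
	fetch func() ([]byte, error)
}

// prepareProjected gathers every source; errors are collected (as
// projected.go:194 does) and any failure aborts the whole mount.
func prepareProjected(sources []source) (map[string][]byte, error) {
	payload := map[string][]byte{}
	var errs []error
	for _, s := range sources {
		data, err := s.fetch()
		if err != nil {
			errs = append(errs, err)
			continue
		}
		payload[s.path] = data
	}
	if len(errs) > 0 {
		return nil, errors.Join(errs...)
	}
	return payload, nil
}

func main() {
	// "not registered": the object cache refuses the lookup outright.
	notRegistered := func(ns, name string) func() ([]byte, error) {
		return func() ([]byte, error) {
			return nil, fmt.Errorf("object %q/%q not registered", ns, name)
		}
	}
	_, err := prepareProjected([]source{
		{"token", func() ([]byte, error) { return []byte("sa-token"), nil }},
		{"ca.crt", notRegistered("openshift-network-diagnostics", "kube-root-ca.crt")},
		{"service-ca.crt", notRegistered("openshift-network-diagnostics", "openshift-service-ca.crt")},
	})
	fmt.Println("MountVolume.SetUp failed:", err)
}
```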
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935442 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: E0121 14:29:41.935461 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:43.93545597 +0000 UTC m=+21.844195902 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.946510 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.963250 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.985670 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:41 crc kubenswrapper[4720]: I0121 14:29:41.997520 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:41Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.009594 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.020945 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.031956 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035147 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035170 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035204 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035218 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035235 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035244 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035288 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035251 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-system-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035243 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-multus-certs\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-etc-kubernetes\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: 
\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035339 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c1128ddd-06c2-4255-aa17-b62aa0f8a996-rootfs\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035355 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035392 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035371 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035439 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035461 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035477 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: 
I0121 14:29:42.035505 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035521 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035539 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035559 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035582 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035597 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035661 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035711 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035728 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035769 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035787 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035804 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035820 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-binary-copy\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035916 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035951 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035958 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035970 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.035997 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036041 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036081 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036096 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036110 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036140 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036155 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036172 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036189 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-hostroot\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036235 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-os-release\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036268 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036288 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036290 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-k8s-cni-cncf-io\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036312 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036332 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-socket-dir-parent\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036371 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-kubelet\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036522 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036538 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-cni-dir\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036543 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036566 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-run-netns\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036765 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-bin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036837 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.036879 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-os-release\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037166 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-host-var-lib-cni-multus\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-daemon-config\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037217 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-multus-conf-dir\") pod \"multus-w85dm\" (UID: 
\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cnibin\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037401 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037560 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c1128ddd-06c2-4255-aa17-b62aa0f8a996-mcd-auth-proxy-config\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037600 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037639 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037739 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-system-cni-dir\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037755 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/14cdc412-e60b-4b9b-b37d-33b1f061f44d-cnibin\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.037898 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-cni-binary-copy\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: 
I0121 14:29:42.039970 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.040484 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c1128ddd-06c2-4255-aa17-b62aa0f8a996-proxy-tls\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.048757 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14cdc412-e60b-4b9b-b37d-33b1f061f44d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5r9wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.055623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm9d5\" (UniqueName: \"kubernetes.io/projected/c1128ddd-06c2-4255-aa17-b62aa0f8a996-kube-api-access-vm9d5\") pod \"machine-config-daemon-2pbsk\" (UID: \"c1128ddd-06c2-4255-aa17-b62aa0f8a996\") " pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc 
kubenswrapper[4720]: I0121 14:29:42.056293 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ovnkube-node-zr5bd\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.057861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2xr\" (UniqueName: \"kubernetes.io/projected/a40805c6-ef8a-4ae0-bb5b-1834d257e8c6-kube-api-access-dz2xr\") pod \"multus-w85dm\" (UID: \"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\") " pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.058124 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8fhj\" (UniqueName: \"kubernetes.io/projected/14cdc412-e60b-4b9b-b37d-33b1f061f44d-kube-api-access-w8fhj\") pod \"multus-additional-cni-plugins-5r9wf\" (UID: \"14cdc412-e60b-4b9b-b37d-33b1f061f44d\") " pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.067949 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w85dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz2xr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w85dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.078152 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.086511 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.105508 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.119761 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.130437 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.134393 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.144083 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1128ddd_06c2_4255_aa17_b62aa0f8a996.slice/crio-c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b WatchSource:0}: Error finding container c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b: Status 404 returned error can't find the container with id c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.144948 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.156975 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w85dm" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.157278 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14cdc412_e60b_4b9b_b37d_33b1f061f44d.slice/crio-6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee WatchSource:0}: Error finding container 6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee: Status 404 returned error can't find the container with id 6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.163900 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pbsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.172434 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.172854 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda40805c6_ef8a_4ae0_bb5b_1834d257e8c6.slice/crio-a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79 WatchSource:0}: Error finding container a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79: Status 404 returned error can't find the container with id a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.191919 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zr5bd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.196645 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac61c15b_6fe9_4c83_9ca7_588095ab1a29.slice/crio-ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e WatchSource:0}: Error finding container 
ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e: Status 404 returned error can't find the container with id ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.220395 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.227715 4720 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228041 4720 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228071 4720 reflector.go:484] object-"openshift-multus"/"default-dockercfg-2q5b6": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"default-dockercfg-2q5b6": Unexpected watch close - watch lasted less than a second and no 
items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228092 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228119 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228128 4720 reflector.go:484] object-"openshift-multus"/"multus-daemon-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"multus-daemon-config": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228132 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228150 4720 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228168 4720 reflector.go:484] object-"openshift-multus"/"cni-copy-resources": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"cni-copy-resources": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233102 4720 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233128 4720 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233141 4720 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233295 4720 reflector.go:484] object-"openshift-multus"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233545 4720 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.233834 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": watch of *v1.Secret ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234008 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234045 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-config": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-config": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234079 4720 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234135 4720 reflector.go:484] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234157 4720 reflector.go:484] object-"openshift-multus"/"default-cni-sysctl-allowlist": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"default-cni-sysctl-allowlist": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234532 4720 reflector.go:484] object-"openshift-multus"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-multus"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.234610 4720 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: W0121 14:29:42.228044 4720 reflector.go:484] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": watch of *v1.Secret ended with: very short watch: object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": Unexpected watch close - watch lasted less than a second and no items received Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.304746 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:11:14.422962604 +0000 UTC Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.378920 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.404207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.409288 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0298c3b240af732b45430e55df1591c7995466acad5aa3551a12a74b6f7b06f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7768c6c697364cf9f063ee60e27385f44ea0f734c4c908282d9643056742b93d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.441327 
4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.474388 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.486990 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1128ddd-06c2-4255-aa17-b62aa0f8a996\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vm9d5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2pbsk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.564280 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host
-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvf2r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zr5bd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.598015 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.629012 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://375c1a86c17f5272e6bc4717537fe63a4c86280b1a292edd1676035544d5c4ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.679591 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.679704 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: E0121 14:29:42.679763 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.722745 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.753836 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14cdc412-e60b-4b9b-b37d-33b1f061f44d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins 
bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\
\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w8fhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5r9wf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.777570 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-w85dm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dz2xr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11
\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-multus\"/\"multus-w85dm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.795279 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ef0f5a17ed649013616aa7b2d7e65b892f61c734ceec9ad1d7443d10876af78e"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.799863 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4qfb" event={"ID":"d24af441-df03-462d-914a-165777766cf4","Type":"ContainerStarted","Data":"c3482bd848f0a6c664ad2fc29e050d1dd9f33ac0c4233db1f9c778f22060b4b7"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.801379 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.801414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"a6517f927ed93f2b097ba1c364eef63a89682d8194fb196d8a1b774255191b79"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"e987849aae01f0808d10da2fa7a849ecf45678e5acbdff5e43105398fd5e192a"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.804093 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c77903ace690b2415d0afa0b85b0180f68914da3564489e726e04c99d3930f4b"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805192 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" exitCode=0 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805241 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.805256 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.808617 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ab4db5b-b639-46bf-b87c-053109420c2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b66da070ca2e79fa6013abc854fba85856cc805366bb90b35c7c41f38c4ee362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4a61cfef1d25e81c7dd55aaf21e827449c2eb622097df0d924e5e2dfebc41d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7607a1d64a04f656ff3b4f81c9b30350791bdc08b7a909de30e247eee4422dc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823396 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="c51909f5d4fb259c3b11d7389e46f2e135ead03f40bfbbd5807e38246de5e808" exitCode=0 Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823912 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"c51909f5d4fb259c3b11d7389e46f2e135ead03f40bfbbd5807e38246de5e808"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.823940 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerStarted","Data":"6d28cafbb02a492ec5764de06716b90abe724a20b5f175b9b2a9240735e637ee"} Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.825834 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: E0121 14:29:42.838937 4720 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.852926 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.895226 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"41e4ae1e-77b8-40b8-9f64-1eba5a39188a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-21T14:29:40Z\\\",\\\"message\\\":\\\"le observer\\\\nW0121 14:29:39.836718 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0121 14:29:39.836860 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0121 14:29:39.841807 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-4029814202/tls.crt::/tmp/serving-cert-4029814202/tls.key\\\\\\\"\\\\nI0121 14:29:40.119738 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0121 14:29:40.121790 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0121 14:29:40.121812 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0121 14:29:40.121864 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0121 14:29:40.121871 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0121 14:29:40.137231 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0121 14:29:40.137263 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137268 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0121 14:29:40.137274 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0121 14:29:40.137279 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0121 14:29:40.137282 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0121 14:29:40.137286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0121 14:29:40.137463 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0121 14:29:40.138686 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-21T14:29:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-21T14:29:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.924627 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qfb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24af441-df03-462d-914a-165777766cf4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3482bd848f0a6c664ad2fc29e050d1dd9f33ac0c4233db1f9c778f22060b4b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vq9z7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-21T14:29:41Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qfb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:42 crc kubenswrapper[4720]: I0121 14:29:42.960828 4720 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-21T14:29:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef0f5a17ed649013616aa7b2d7e65b892f61c734ceec9ad1d7443d10876af78e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-21T14:29:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-21T14:29:42Z is after 2025-08-24T17:21:41Z" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.010074 4720 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013330 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013364 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013372 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.013457 4720 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.030796 4720 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.031126 4720 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032199 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032240 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 21 14:29:43 crc 
kubenswrapper[4720]: I0121 14:29:43.032253 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032269 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.032282 4720 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-21T14:29:43Z","lastTransitionTime":"2026-01-21T14:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.079098 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podStartSLOduration=2.079082981 podStartE2EDuration="2.079082981s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.078950818 +0000 UTC m=+20.987690770" watchObservedRunningTime="2026-01-21 14:29:43.079082981 +0000 UTC m=+20.987822903" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.098421 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.102269 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.102582 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104451 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104680 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104822 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.104952 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.142274 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153807 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153855 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153916 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.153959 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.167894 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.167876442 podStartE2EDuration="1.167876442s" podCreationTimestamp="2026-01-21 14:29:42 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.167302125 +0000 UTC m=+21.076042057" watchObservedRunningTime="2026-01-21 14:29:43.167876442 +0000 UTC m=+21.076616384" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.186502 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.214294 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.214271223 podStartE2EDuration="3.214271223s" podCreationTimestamp="2026-01-21 14:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.189441576 +0000 UTC m=+21.098181528" watchObservedRunningTime="2026-01-21 14:29:43.214271223 +0000 UTC m=+21.123011175" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.236653 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254713 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254745 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.254779 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255449 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255719 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.255761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.259024 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.260900 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x5ldg"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.261208 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.262933 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.263151 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.263532 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.264272 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.274778 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e6f177bb-4eff-4b46-bc6b-0712b4b787ac-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hjmcc\" (UID: \"e6f177bb-4eff-4b46-bc6b-0712b4b787ac\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.306143 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 07:45:35.116189346 +0000 UTC Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.306200 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.332635 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.334046 4720 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.336719 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=3.336700396 podStartE2EDuration="3.336700396s" podCreationTimestamp="2026-01-21 14:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.336570542 +0000 UTC m=+21.245310494" watchObservedRunningTime="2026-01-21 14:29:43.336700396 +0000 UTC m=+21.245440328" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.337117 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w85dm" podStartSLOduration=2.337110997 podStartE2EDuration="2.337110997s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.299192604 +0000 UTC m=+21.207932546" watchObservedRunningTime="2026-01-21 14:29:43.337110997 +0000 UTC m=+21.245850929" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.355575 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc 
kubenswrapper[4720]: I0121 14:29:43.355613 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.355638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.366510 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k4qfb" podStartSLOduration=2.366493261 podStartE2EDuration="2.366493261s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:43.36536749 +0000 UTC m=+21.274107422" watchObservedRunningTime="2026-01-21 14:29:43.366493261 +0000 UTC m=+21.275233193" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.412153 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.414060 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.418550 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.427427 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.445625 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456200 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456219 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.456479 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/790bf9ea-decc-4a7a-b349-bf7358d50842-host\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.457273 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/790bf9ea-decc-4a7a-b349-bf7358d50842-serviceca\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.470306 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.481296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2247n\" (UniqueName: \"kubernetes.io/projected/790bf9ea-decc-4a7a-b349-bf7358d50842-kube-api-access-2247n\") pod \"node-ca-x5ldg\" (UID: \"790bf9ea-decc-4a7a-b349-bf7358d50842\") " pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.547823 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.560701 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.561075 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.562830 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.563165 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.572329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x5ldg" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.586949 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: W0121 14:29:43.591751 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790bf9ea_decc_4a7a_b349_bf7358d50842.slice/crio-3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca WatchSource:0}: Error finding container 3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca: Status 404 returned error can't find the container with id 3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.596173 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"] Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.596633 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.596717 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.608019 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.647401 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658159 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658355 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lfqm\" (UniqueName: \"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658566 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658677 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.658799 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.677243 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.677286 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.677356 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.677710 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.687395 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.708729 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.728116 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.750034 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759422 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lfqm\" (UniqueName: \"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759523 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759551 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 
14:29:43.759581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.759602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.759841 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.759896 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:44.259881043 +0000 UTC m=+22.168621015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.760736 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.760748 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.765688 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b6c8f4e3-ac08-4482-b686-a4b1618e051d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.784497 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mq7\" (UniqueName: \"kubernetes.io/projected/139c8416-e015-49e4-adfe-32f9e142621f-kube-api-access-m5mq7\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.800544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lfqm\" (UniqueName: 
\"kubernetes.io/projected/b6c8f4e3-ac08-4482-b686-a4b1618e051d-kube-api-access-2lfqm\") pod \"ovnkube-control-plane-749d76644c-sdvtx\" (UID: \"b6c8f4e3-ac08-4482-b686-a4b1618e051d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.807195 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.827053 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.830059 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="33c8d4e7303bd3b7659ada685627eec03fed3192711e42d04c1b4ba547abb7d7" exitCode=0 Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.830150 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"33c8d4e7303bd3b7659ada685627eec03fed3192711e42d04c1b4ba547abb7d7"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.832076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5ldg" event={"ID":"790bf9ea-decc-4a7a-b349-bf7358d50842","Type":"ContainerStarted","Data":"3eb5b5e8d7905bfe5fb6b9ac2299a87f1ec86126ebf1c77d4e712a04bf1c45ca"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.835439 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" event={"ID":"e6f177bb-4eff-4b46-bc6b-0712b4b787ac","Type":"ContainerStarted","Data":"dc9f16dddb9a855e83cdad9b82369b603c8b6a1148856dd1c05e19ff3e26d54f"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842012 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842049 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842061 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.842072 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.847328 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.867775 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.960787 4720 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.960927 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.960901921 +0000 UTC m=+25.869641863 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961136 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961242 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:43 crc kubenswrapper[4720]: I0121 14:29:43.961329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.961750 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.961805 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.961794136 +0000 UTC m=+25.870534078 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962543 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962568 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962582 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.962620 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.962606158 +0000 UTC m=+25.871346100 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.963632 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.963705 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.963693689 +0000 UTC m=+25.872433631 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964259 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964288 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964299 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:43 crc kubenswrapper[4720]: E0121 14:29:43.964374 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.964357657 +0000 UTC m=+25.873097589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.075708 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" Jan 21 14:29:44 crc kubenswrapper[4720]: W0121 14:29:44.098924 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6c8f4e3_ac08_4482_b686_a4b1618e051d.slice/crio-6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70 WatchSource:0}: Error finding container 6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70: Status 404 returned error can't find the container with id 6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70 Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.265726 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.265832 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.265875 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:45.265861243 +0000 UTC m=+23.174601175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.677204 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:44 crc kubenswrapper[4720]: E0121 14:29:44.677616 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.850774 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" event={"ID":"e6f177bb-4eff-4b46-bc6b-0712b4b787ac","Type":"ContainerStarted","Data":"2ea9fd8cccf96a7a49117394d251bbe2af6588e5015076c31ed1c46098fcf8c7"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.853881 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.855799 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="09018c7b53b0ac3ffa11afab29d27968cab8b22bd7df3ef7ac95f773b2bca6c5" exitCode=0 Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.855870 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"09018c7b53b0ac3ffa11afab29d27968cab8b22bd7df3ef7ac95f773b2bca6c5"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.857068 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x5ldg" event={"ID":"790bf9ea-decc-4a7a-b349-bf7358d50842","Type":"ContainerStarted","Data":"d5d7991d26c12ced440dcb34bb4789ce5ba5f14b99bdda7210328b02094ca76d"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.861985 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"4d7058ee5b191ba3e2eda1651d9fbfe809e9755faa96a201cefe8834b47387b9"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.862035 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"6140416b33ea797306d905b7a77da994c647560417081684c7ca5b22ff2dfe70"} Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.865543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hjmcc" podStartSLOduration=3.865524299 podStartE2EDuration="3.865524299s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:44.864314765 +0000 UTC m=+22.773054717" watchObservedRunningTime="2026-01-21 14:29:44.865524299 +0000 UTC m=+22.774264241" Jan 21 14:29:44 crc kubenswrapper[4720]: I0121 14:29:44.907435 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x5ldg" podStartSLOduration=2.907413724 podStartE2EDuration="2.907413724s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:44.906730705 +0000 UTC m=+22.815470657" watchObservedRunningTime="2026-01-21 14:29:44.907413724 +0000 UTC m=+22.816153656" Jan 21 14:29:45 crc 
kubenswrapper[4720]: I0121 14:29:45.276912 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.277110 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.277161 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:47.277145462 +0000 UTC m=+25.185885394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677850 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677933 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.677944 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.677968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.678013 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:45 crc kubenswrapper[4720]: E0121 14:29:45.678097 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.870664 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="550b7d659a2217ce1d8c62b141c2b73ef539f9cc9abef756b517e7d2290bc55f" exitCode=0 Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.870705 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"550b7d659a2217ce1d8c62b141c2b73ef539f9cc9abef756b517e7d2290bc55f"} Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.872259 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" event={"ID":"b6c8f4e3-ac08-4482-b686-a4b1618e051d","Type":"ContainerStarted","Data":"4b529a8be01df9ff160bd230126312dcee298bbd33c9ad49582dc39b4fe7b034"} Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.877559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} Jan 21 14:29:45 crc kubenswrapper[4720]: I0121 14:29:45.901303 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sdvtx" podStartSLOduration=3.901285176 podStartE2EDuration="3.901285176s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:45.899105784 +0000 UTC m=+23.807845746" watchObservedRunningTime="2026-01-21 14:29:45.901285176 +0000 UTC m=+23.810025098" Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.677444 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:46 crc kubenswrapper[4720]: E0121 14:29:46.677569 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.886347 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="c112b8b02df124c91a8f0307eb2756328ec9a65a83a3ddc4c767bc2a57c58335" exitCode=0 Jan 21 14:29:46 crc kubenswrapper[4720]: I0121 14:29:46.886426 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"c112b8b02df124c91a8f0307eb2756328ec9a65a83a3ddc4c767bc2a57c58335"} Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.296793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.296934 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.296986 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:51.296972885 +0000 UTC m=+29.205712817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677911 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677943 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.677907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678034 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678167 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:47 crc kubenswrapper[4720]: E0121 14:29:47.678259 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.899345 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.904961 4720 generic.go:334] "Generic (PLEG): container finished" podID="14cdc412-e60b-4b9b-b37d-33b1f061f44d" containerID="cf8224219befa4cd9404148c59cc21d712fed48c71e842ff810a2f9df97e9301" exitCode=0 Jan 21 14:29:47 crc kubenswrapper[4720]: I0121 14:29:47.905005 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerDied","Data":"cf8224219befa4cd9404148c59cc21d712fed48c71e842ff810a2f9df97e9301"} Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.003922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004038 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004137 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004109686 +0000 UTC m=+33.912849628 (durationBeforeRetry 8s). 
Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004137 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004109686 +0000 UTC m=+33.912849628 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004172 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004217 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004235 4720 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004305 4720 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004310 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004299511 +0000 UTC m=+33.913039553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004374 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004358123 +0000 UTC m=+33.913098125 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.004416 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004534 4720 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004577 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004566438 +0000 UTC m=+33.913306440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004583 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004601 4720 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004614 4720 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.004691 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-21 14:29:56.004679971 +0000 UTC m=+33.913419963 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.678149 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:48 crc kubenswrapper[4720]: E0121 14:29:48.678479 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.911229 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" event={"ID":"14cdc412-e60b-4b9b-b37d-33b1f061f44d","Type":"ContainerStarted","Data":"dcbe475b94875f9187e3876f749e33c56cdbc98bbdb7843109a51d6b85182eeb"} Jan 21 14:29:48 crc kubenswrapper[4720]: I0121 14:29:48.933144 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5r9wf" podStartSLOduration=7.933129758 podStartE2EDuration="7.933129758s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:48.930261068 +0000 UTC m=+26.839001010" watchObservedRunningTime="2026-01-21 14:29:48.933129758 +0000 UTC m=+26.841869690" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.677977 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.678003 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678112 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678281 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.678002 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:49 crc kubenswrapper[4720]: E0121 14:29:49.678565 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.919837 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerStarted","Data":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.921168 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.921212 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.952172 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podStartSLOduration=8.952153005 podStartE2EDuration="8.952153005s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:29:49.951308301 +0000 UTC m=+27.860048243" watchObservedRunningTime="2026-01-21 14:29:49.952153005 +0000 UTC m=+27.860892937" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.954429 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:49 crc kubenswrapper[4720]: I0121 14:29:49.956413 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:29:50 crc kubenswrapper[4720]: I0121 14:29:50.677480 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:50 crc kubenswrapper[4720]: E0121 14:29:50.677931 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:50 crc kubenswrapper[4720]: I0121 14:29:50.921989 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.336160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.336336 4720 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.336382 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. 
Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.336382 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs podName:139c8416-e015-49e4-adfe-32f9e142621f nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.336368103 +0000 UTC m=+37.245108025 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs") pod "network-metrics-daemon-x48m6" (UID: "139c8416-e015-49e4-adfe-32f9e142621f") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678157 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678191 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.678170 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678304 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678417 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:51 crc kubenswrapper[4720]: E0121 14:29:51.678472 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:51 crc kubenswrapper[4720]: I0121 14:29:51.925107 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.400254 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"] Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.400360 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:52 crc kubenswrapper[4720]: E0121 14:29:52.400453 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.677296 4720 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:52 crc kubenswrapper[4720]: E0121 14:29:52.678574 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:52 crc kubenswrapper[4720]: I0121 14:29:52.982105 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:29:53 crc kubenswrapper[4720]: I0121 14:29:53.677700 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:53 crc kubenswrapper[4720]: E0121 14:29:53.677853 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 21 14:29:53 crc kubenswrapper[4720]: I0121 14:29:53.677970 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:53 crc kubenswrapper[4720]: E0121 14:29:53.678178 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 21 14:29:54 crc kubenswrapper[4720]: I0121 14:29:54.677213 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:54 crc kubenswrapper[4720]: I0121 14:29:54.677255 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:54 crc kubenswrapper[4720]: E0121 14:29:54.677495 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-x48m6" podUID="139c8416-e015-49e4-adfe-32f9e142621f" Jan 21 14:29:54 crc kubenswrapper[4720]: E0121 14:29:54.677376 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.230621 4720 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.231110 4720 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.264738 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265409 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265583 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.265986 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.274681 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.274910 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275525 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275885 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276035 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.275926 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276123 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276310 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276413 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276477 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276543 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276544 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276822 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.276978 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277071 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277512 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277606 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.277993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.279315 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.285172 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.285646 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.287559 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.287897 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.288047 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.288138 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.289984 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290055 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290166 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290450 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.290559 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.291554 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.292010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.292012 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.294566 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.294837 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.296073 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.296124 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.307899 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308283 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.307988 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308547 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308590 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308548 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308796 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.308947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309108 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309259 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309005 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309467 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309616 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309707 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309385 4720 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309884 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.309975 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310043 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310073 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310217 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310445 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310010 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310620 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310386 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310410 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310454 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.312503 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.310956 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.311026 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.312905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.313168 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.317826 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.320068 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.320688 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.321060 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.321611 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.323481 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.323744 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.326778 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.327469 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.328322 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.328890 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.329073 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.331626 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.336477 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.338313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.340469 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.340725 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"] Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.364634 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.365216 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 14:29:55 crc kubenswrapper[4720]: I0121 14:29:55.366614 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.366877 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.366998 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367043 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811926 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811961 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811991 4720 
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812018 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812056 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367079 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812079 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812100 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812123 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812159 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID:
\"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812178 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367237 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812238 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812259 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812280 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812296 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367402 4720 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812389 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812409 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812428 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812468 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812488 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367447 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812548 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812567 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812604 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.367493 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812675 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812736 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812764 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812789 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812816 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812848 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812880 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812905 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812934 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812962 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.812988 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813017 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813040 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813091 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813117 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813141 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813168 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813194 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813222 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813250 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813281 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813341 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813370 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813424 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813498 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813601 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813640 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813725 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813779 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813811 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813869 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813928 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:55.368276 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.818302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.818857 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811272 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.811290 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813381 4720 request.go:700] Waited for 1.482992398s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/secrets?fieldSelector=metadata.name%3Dmetrics-tls&limit=500&resourceVersion=0
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813815 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.813878 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838157 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838334 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838508 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.838754 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.839082 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.840503 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841109 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841326 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.841736 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.842213 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.843936 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.844222 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845155 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845232 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845352 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845524 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845723 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.845903 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846035 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846361 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.846542 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.847526 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.848807 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.849566 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"]
Jan 21 14:29:56 crc kubenswrapper[4720]: E0121 14:29:56.850020 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.849987623 +0000 UTC m=+50.758727565 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.851331 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.851830 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852170 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852354 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.852586 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.854911 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.855686 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.856064 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.856377 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.857083 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.857641 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.859689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.861192 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.863628 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864081 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864258 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.864755 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.865172 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.865447 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.871782 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.872110 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.872372 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.874290 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-42g76"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.885198 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.888689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.889195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.889632 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.894427 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.895784 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.896981 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.898526 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.899400 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.901791 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.922993 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923086 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923119 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923152 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923189 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923223 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923276 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923324 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923377 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923408 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923440 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923469 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923506 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923537 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923559 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923618 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923705 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923756 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923792 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923811 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923825 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923847 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923864 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923885 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923903 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923921 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923940 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923961 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923980 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.923996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924013 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924034 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924092 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924114 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924132 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924152 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924169 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924189 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924230 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924246 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924264 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924306 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924324 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924342 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924361 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924380 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924414 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924433 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924470 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924491 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924584 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924606 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924648 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924683 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924718 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924764 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924787 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924808 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924829 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924838 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-njjgs"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.925571 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.925734 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.926105 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.926221 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/afb1ffca-e30f-47cf-b399-2bd057039b10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927539 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.924848 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.927894 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928828 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928875 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928942 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.929891 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-njjgs"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.930114 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.941103 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.941696 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-machine-approver-tls\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.942883 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-serving-cert\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.943144 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.945046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61315eef-fa85-4828-9668-f6f4b1484453-config\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.928771 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.950777 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951790 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb1ffca-e30f-47cf-b399-2bd057039b10-serving-cert\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.951913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.953497 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.954784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955759 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956728 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.962689 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-5qcz5"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.963273 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.964129 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-service-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.960732 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-config\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.960162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a47e9b4-6318-4f71-9db0-105be2ada134-serving-cert\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.955913 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956832 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973457 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.958429 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-etcd-client\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973837 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-dir\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973984 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.974419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-images\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956125 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.975756 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.974006 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.976147 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.976636 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.978286 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.980022 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.981318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/90b6768c-8240-4fc1-a760-59d79a3c1c02-audit-policies\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.981787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-auth-proxy-config\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956206 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956807 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.956850 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.957312 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.983507 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"]
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.983807 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.973475 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"
Jan 21 14:29:56 crc kubenswrapper[4720]: I0121 14:29:56.984943 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.004190 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.004502 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005747 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005810 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.005834 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.006038 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.006708 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1a75d5de-a507-41ca-8206-eae702d16020-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007283 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007509 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.007747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-trusted-ca\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt92v\" (UniqueName: \"kubernetes.io/projected/1a75d5de-a507-41ca-8206-eae702d16020-kube-api-access-jt92v\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008624 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbl4\" (UniqueName: \"kubernetes.io/projected/90b6768c-8240-4fc1-a760-59d79a3c1c02-kube-api-access-ppbl4\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.008747 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009137 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.009553 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.011770 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.012346 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013116 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013478 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.013852 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.014203 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.014617 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015129 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015254 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015390 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015538 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015646 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015759 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015990 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.015257 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4hm\" (UniqueName: \"kubernetes.io/projected/aa4e660f-7816-4c20-b94c-5f9543d9cbed-kube-api-access-5k4hm\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzpl\" (UniqueName: \"kubernetes.io/projected/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-kube-api-access-wxzpl\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016545 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016561 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016487 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016765 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016516 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.016918 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017340 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017505 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.017727 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a47e9b4-6318-4f71-9db0-105be2ada134-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.018221 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.018877 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"controller-manager-879f6c89f-9gr25\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019676 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019676 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019826 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.019938 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020076 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020220 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020615 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020783 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.020947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021052 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021186 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.021467 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.022277 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.023386 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.024236 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.024738 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa2b643f-ce1f-45db-ba7f-31a5fc037650-config\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.025250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027390 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027707 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027728 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.027967 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.028122 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.028409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.029102 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.030627 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa2b643f-ce1f-45db-ba7f-31a5fc037650-serving-cert\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.030809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031203 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031255 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031273 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031288 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031328 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031344 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031361 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031412 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031465 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031487 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031504 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.031761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a75d5de-a507-41ca-8206-eae702d16020-config\") pod \"machine-api-operator-5694c8668f-h9ckd\" (UID: \"1a75d5de-a507-41ca-8206-eae702d16020\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032011 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032530 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vsbm\" (UniqueName: \"kubernetes.io/projected/aa2b643f-ce1f-45db-ba7f-31a5fc037650-kube-api-access-8vsbm\") pod \"console-operator-58897d9998-68kgl\" (UID: \"aa2b643f-ce1f-45db-ba7f-31a5fc037650\") " pod="openshift-console-operator/console-operator-58897d9998-68kgl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032570 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-tx54b"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032588 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-image-import-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.032750 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa4e660f-7816-4c20-b94c-5f9543d9cbed-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-mxvkb\" (UID: \"aa4e660f-7816-4c20-b94c-5f9543d9cbed\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"
Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.032930 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.532917923 +0000 UTC m=+35.441657845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033070 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033398 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/90b6768c-8240-4fc1-a760-59d79a3c1c02-encryption-config\") pod \"apiserver-7bbb656c7d-42snh\" (UID: \"90b6768c-8240-4fc1-a760-59d79a3c1c02\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.033650 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"]
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034014 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s95x9\" (UniqueName: \"kubernetes.io/projected/b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317-kube-api-access-s95x9\") pod \"machine-approver-56656f9798-cnk8x\" (UID: \"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034113 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034425 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tx54b"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.034642 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"console-f9d7485db-42g76\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035082 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035394 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035413 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035621 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07f01852-61b7-4eee-acd6-3d8b8e2b1c85-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-nscjl\" (UID: \"07f01852-61b7-4eee-acd6-3d8b8e2b1c85\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035752 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035785 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-audit-dir\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035842 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-audit\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035893 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040512 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040542 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040577 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040615 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040673 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040703 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040721 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040786 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.038229 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54pb6\" (UniqueName: \"kubernetes.io/projected/afb1ffca-e30f-47cf-b399-2bd057039b10-kube-api-access-54pb6\") pod \"openshift-config-operator-7777fb866f-v2pht\" (UID: \"afb1ffca-e30f-47cf-b399-2bd057039b10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040823 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040922 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040949 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040972 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041003 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041049 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041069 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041090 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.041113 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035929 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0f685084-f748-4a34-9020-4d562f2a6d45-node-pullsecrets\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.035972 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036218 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.042264 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037682 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-encryption-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bdr\" (UniqueName: \"kubernetes.io/projected/120bd3b2-5437-4a15-bcc4-32ae06eb7f1f-kube-api-access-m4bdr\") pod \"downloads-7954f5f757-wmxb9\" (UID: \"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f\") " pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.042817 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036328 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.048281 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.038294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-client\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.039410 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/61315eef-fa85-4828-9668-f6f4b1484453-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.040030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036341 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.049240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"oauth-openshift-558db77b4-7xcc8\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036367 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036406 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036418 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036450 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.051886 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f685084-f748-4a34-9020-4d562f2a6d45-serving-cert\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036556 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036597 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.052515 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-etcd-serving-ca\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.052525 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-config\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.036635 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.037405 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.038026 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.053686 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"route-controller-manager-6576b87f9c-bncrk\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.053814 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a655c79-a709-4d61-8209-200b86144e8b-metrics-tls\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.054171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8848\" (UniqueName: \"kubernetes.io/projected/4a47e9b4-6318-4f71-9db0-105be2ada134-kube-api-access-j8848\") pod \"authentication-operator-69f744f599-92xp4\" (UID: \"4a47e9b4-6318-4f71-9db0-105be2ada134\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055446 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.055460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.057169 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.058602 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.060402 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.061052 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f685084-f748-4a34-9020-4d562f2a6d45-trusted-ca-bundle\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.061085 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.065143 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.065278 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.066178 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.067209 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njjgs"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.067575 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.068482 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.069470 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.070582 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.071816 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.072767 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.075188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.076979 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.077208 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.078402 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.079785 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.080789 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.081938 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.082872 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.083700 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.085887 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.086149 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087002 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087831 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.087981 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.088577 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.089633 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.090085 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.090492 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.095859 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.098907 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.099858 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.101602 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.103825 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.107818 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.121551 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.121760 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.130415 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.130785 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.139978 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.140977 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142396 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142665 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142700 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142762 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142779 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 
14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142815 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142831 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142879 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142898 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142917 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142978 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.142998 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143019 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143040 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143087 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143107 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143124 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143146 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143169 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: 
\"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143191 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143216 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143233 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143255 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143290 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143307 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143331 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143349 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143363 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143397 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143423 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143445 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vzw\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143577 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143643 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143783 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143800 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143852 
4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143868 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143884 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143963 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.143979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144003 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144038 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144060 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144095 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144115 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144134 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144165 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144180 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144195 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144234 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144250 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144265 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144298 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144351 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144384 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144400 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144417 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144451 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144492 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") 
" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144546 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144578 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144596 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144611 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144627 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144641 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.144695 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144716 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.144757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.144950 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.644935414 +0000 UTC m=+35.553675346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.146562 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29766114-9e0b-4064-8010-8f426935f834-config\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.147165 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bdfefc7f-6e59-460a-be36-220a37dd02d1-config-volume\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.148724 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.152524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.153453 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.154558 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-stats-auth\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.154741 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.153788 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.155340 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29766114-9e0b-4064-8010-8f426935f834-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.156439 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.159296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f55572f9-fbba-4efa-a6a8-94884f06f9c3-service-ca-bundle\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.160318 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.166061 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.166138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-metrics-certs\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.167117 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f55572f9-fbba-4efa-a6a8-94884f06f9c3-default-certificate\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.167468 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.168563 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.169769 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.169893 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bdfefc7f-6e59-460a-be36-220a37dd02d1-metrics-tls\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.170023 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.172194 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.173977 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.188646 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.196329 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.206304 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-srv-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.212373 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249378 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249406 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249576 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-plugins-dir\") pod \"csi-hostpathplugin-j577t\" (UID: 
\"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249682 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249706 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249785 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249806 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249821 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249877 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.249966 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72vzw\" (UniqueName: 
\"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250133 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250195 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250213 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250272 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250291 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-csi-data-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.250602 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.251148 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.751135753 +0000 UTC m=+35.659875685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251379 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251519 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251534 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-socket-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251545 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.251668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.252822 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-mountpoint-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.252994 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s999\" (UniqueName: \"kubernetes.io/projected/61315eef-fa85-4828-9668-f6f4b1484453-kube-api-access-6s999\") pod \"openshift-apiserver-operator-796bbdcf4f-p6shd\" (UID: \"61315eef-fa85-4828-9668-f6f4b1484453\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258909 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258949 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.258980 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: 
\"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259042 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259115 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259138 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.259156 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259175 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259218 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259255 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259311 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259329 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " 
pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259346 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259362 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259380 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259403 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259422 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259445 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259464 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259487 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259507 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259524 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259539 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259554 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259570 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259588 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259606 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259627 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259644 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: 
\"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.259996 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260625 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd1cfb10-4405-4ab9-8631-690622069d01-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-config\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.260811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-auth-proxy-config\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.261075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8ac39f2f-2411-4585-b15c-c473b2fdc077-images\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.264958 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-client\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.266220 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.266226 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/92d3c944-8def-4f95-a3cb-781f929f5f28-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.267390 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268160 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-etcd-service-ca\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268295 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22087b1b-3ded-441f-8349-fb8f38809460-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.268394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e675e6aa-6d61-4490-b768-1dbe664d1dfe-registration-dir\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.269672 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/25067bcc-8503-442b-b348-87d7e1321dbd-tmpfs\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.270620 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.271554 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c00abc0-dc46-406c-8f2f-6904ac88126d-trusted-ca\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.273133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1796695a-873c-4c15-9351-9b5bc5607830-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.273536 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.276386 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.284294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-srv-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.287480 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-serving-cert\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.287847 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.289756 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5c00abc0-dc46-406c-8f2f-6904ac88126d-metrics-tls\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290134 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8ac39f2f-2411-4585-b15c-c473b2fdc077-proxy-tls\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290348 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.290795 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.291490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.293964 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-webhook-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297033 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd1cfb10-4405-4ab9-8631-690622069d01-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297315 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1796695a-873c-4c15-9351-9b5bc5607830-proxy-tls\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.297801 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/25067bcc-8503-442b-b348-87d7e1321dbd-apiservice-cert\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.298022 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ac33402e-edb9-41ab-bb76-b17108b5ea0d-profile-collector-cert\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.301556 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.303117 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/fccce0ee-16e1-4237-8081-a6a3c93c5851-profile-collector-cert\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.307371 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.309641 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.317674 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-config\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.331454 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.359473 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.361216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.361736 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.861721904 +0000 UTC m=+35.770461836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.367702 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.368640 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.387997 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.404390 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-certs\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.409515 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.421051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/61f96497-68d8-4347-b831-f7bc0204c677-signing-key\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.429705 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.447738 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.451997 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/61f96497-68d8-4347-b831-f7bc0204c677-signing-cabundle\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.464295 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.464674 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:57.96464192 +0000 UTC m=+35.873381862 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.468019 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.489081 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.507736 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.514123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/728ae7a4-9793-4555-abbb-b8a352700089-node-bootstrap-token\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.565543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.566028 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.066006013 +0000 UTC m=+35.974745945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.567942 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.568891 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.569551 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.069536261 +0000 UTC m=+35.978276193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.573825 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.584850 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m254w\" (UniqueName: \"kubernetes.io/projected/0f685084-f748-4a34-9020-4d562f2a6d45-kube-api-access-m254w\") pod \"apiserver-76f77b778f-pm8dm\" (UID: \"0f685084-f748-4a34-9020-4d562f2a6d45\") " pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.589127 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.608523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7tz\" (UniqueName: \"kubernetes.io/projected/9a655c79-a709-4d61-8209-200b86144e8b-kube-api-access-lz7tz\") pod \"dns-operator-744455d44c-zvq7p\" (UID: \"9a655c79-a709-4d61-8209-200b86144e8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.612164 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 14:29:57 crc 
kubenswrapper[4720]: I0121 14:29:57.628510 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.651098 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.660620 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.670039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.670214 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.670629 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.170609296 +0000 UTC m=+36.079349228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.672915 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h9ckd"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.678476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-config\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.691917 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.696232 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.699376 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.709416 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.712356 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.723087 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-serving-cert\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.728430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.748321 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.763203 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22087b1b-3ded-441f-8349-fb8f38809460-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.770809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.772081 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.774953 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.274935021 +0000 UTC m=+36.183674943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.776505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.790243 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.807271 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.834871 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.843566 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerStarted","Data":"30cad834f566f85c0f3a6de4d149c40b4e51c114cf6d66d633ef1b6be4e13903"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.843775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-cert\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.847782 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.849581 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerStarted","Data":"1a03a4355bd12eae90e463960102d7b8d0f28a5a014b426c9235206feb008d3a"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.851795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"ea6b107f3c3026106a39c3caf82ba6fa45ff98f05ec346681fa1da42911087dc"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.856438 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"2d808d09bbc7f5cb2b76e9766c449e0d4ba9970ce27620ef9bfe40a6dd0a49ae"} Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.873074 4720 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 
14:29:57.888900 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.907296 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:57 crc kubenswrapper[4720]: E0121 14:29:57.907880 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.407860609 +0000 UTC m=+36.316600541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.909204 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.954693 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:57 crc kubenswrapper[4720]: W0121 14:29:57.987391 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d WatchSource:0}: Error finding container 9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d: Status 404 returned error can't find the container with id 9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d Jan 21 14:29:57 crc kubenswrapper[4720]: I0121 14:29:57.988202 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fqt6\" (UniqueName: \"kubernetes.io/projected/d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb-kube-api-access-6fqt6\") pod \"kube-storage-version-migrator-operator-b67b599dd-gnrtf\" (UID: \"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:57 crc kubenswrapper[4720]: W0121 14:29:57.990087 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408 WatchSource:0}: Error finding container 7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408: Status 404 returned error can't find the container with id 7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408 Jan 21 14:29:57 crc 
kubenswrapper[4720]: I0121 14:29:57.994903 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q4rp\" (UniqueName: \"kubernetes.io/projected/ac33402e-edb9-41ab-bb76-b17108b5ea0d-kube-api-access-2q4rp\") pod \"catalog-operator-68c6474976-qkmbd\" (UID: \"ac33402e-edb9-41ab-bb76-b17108b5ea0d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.007792 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lhd59\" (UID: \"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.010210 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.010911 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.510893999 +0000 UTC m=+36.419633931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.011547 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-68kgl"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.027049 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-92xp4"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.061695 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wmxb9"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.064821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j2h\" (UniqueName: \"kubernetes.io/projected/f55572f9-fbba-4efa-a6a8-94884f06f9c3-kube-api-access-28j2h\") pod \"router-default-5444994796-5qcz5\" (UID: \"f55572f9-fbba-4efa-a6a8-94884f06f9c3\") " pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.070709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29766114-9e0b-4064-8010-8f426935f834-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gjtkx\" (UID: \"29766114-9e0b-4064-8010-8f426935f834\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:58 crc 
kubenswrapper[4720]: I0121 14:29:58.076785 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfxs\" (UniqueName: \"kubernetes.io/projected/bdfefc7f-6e59-460a-be36-220a37dd02d1-kube-api-access-xvfxs\") pod \"dns-default-njjgs\" (UID: \"bdfefc7f-6e59-460a-be36-220a37dd02d1\") " pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.089792 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-njjgs" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.092202 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.116192 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.116845 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.616808419 +0000 UTC m=+36.525548351 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.117629 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.117963 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.617947511 +0000 UTC m=+36.526687443 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.118078 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.122686 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.137428 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.137833 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.138644 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.144113 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.147461 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd1cfb10-4405-4ab9-8631-690622069d01-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-gxq5z\" (UID: \"fd1cfb10-4405-4ab9-8631-690622069d01\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.149754 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssqc7\" (UniqueName: \"kubernetes.io/projected/61f96497-68d8-4347-b831-f7bc0204c677-kube-api-access-ssqc7\") pod \"service-ca-9c57cc56f-7vkvw\" (UID: \"61f96497-68d8-4347-b831-f7bc0204c677\") " pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.161686 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.166057 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zvq7p"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.172975 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-pm8dm"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.174050 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.185982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.187960 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4fsv\" (UniqueName: \"kubernetes.io/projected/25067bcc-8503-442b-b348-87d7e1321dbd-kube-api-access-l4fsv\") pod \"packageserver-d55dfcdfc-86dvk\" (UID: \"25067bcc-8503-442b-b348-87d7e1321dbd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.195789 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c868m\" (UniqueName: \"kubernetes.io/projected/e675e6aa-6d61-4490-b768-1dbe664d1dfe-kube-api-access-c868m\") pod \"csi-hostpathplugin-j577t\" (UID: \"e675e6aa-6d61-4490-b768-1dbe664d1dfe\") " pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.204524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwqhl\" (UniqueName: \"kubernetes.io/projected/1796695a-873c-4c15-9351-9b5bc5607830-kube-api-access-zwqhl\") pod \"machine-config-controller-84d6567774-zxv2h\" (UID: \"1796695a-873c-4c15-9351-9b5bc5607830\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.205381 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.219339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.219698 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.719671024 +0000 UTC m=+36.628410966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.222131 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.222579 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 14:29:58.722559955 +0000 UTC m=+36.631299887 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.223671 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.240797 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"marketplace-operator-79b997595-vxdw2\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") " pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.240996 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.241100 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a655c79_a709_4d61_8209_200b86144e8b.slice/crio-646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5 WatchSource:0}: Error finding container 646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5: Status 404 returned error can't find the container with id 646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5 Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.250671 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xsv\" (UniqueName: \"kubernetes.io/projected/92d3c944-8def-4f95-a3cb-781f929f5f28-kube-api-access-t7xsv\") pod \"multus-admission-controller-857f4d67dd-7mfnf\" (UID: \"92d3c944-8def-4f95-a3cb-781f929f5f28\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.260814 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.263218 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg5ng\" (UniqueName: \"kubernetes.io/projected/fccce0ee-16e1-4237-8081-a6a3c93c5851-kube-api-access-tg5ng\") pod \"olm-operator-6b444d44fb-5gg5l\" (UID: \"fccce0ee-16e1-4237-8081-a6a3c93c5851\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.268238 4720 request.go:700] Waited for 1.016211237s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.271583 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f685084_f748_4a34_9020_4d562f2a6d45.slice/crio-fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce WatchSource:0}: Error finding container fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce: Status 404 returned error can't find the container with id fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.285969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vzw\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-kube-api-access-72vzw\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.300840 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" Jan 21 14:29:58 crc kubenswrapper[4720]: W0121 14:29:58.306868 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55572f9_fbba_4efa_a6a8_94884f06f9c3.slice/crio-352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f WatchSource:0}: Error finding container 352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f: Status 404 returned error can't find the container with id 352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.322111 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323080 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.323465 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.823441714 +0000 UTC m=+36.732181646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323538 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.323821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dddnq\" (UniqueName: \"kubernetes.io/projected/8ac39f2f-2411-4585-b15c-c473b2fdc077-kube-api-access-dddnq\") pod \"machine-config-operator-74547568cd-n66gl\" (UID: \"8ac39f2f-2411-4585-b15c-c473b2fdc077\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.324162 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 14:29:58.824150253 +0000 UTC m=+36.732890185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.344516 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22087b1b-3ded-441f-8349-fb8f38809460-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-swblx\" (UID: \"22087b1b-3ded-441f-8349-fb8f38809460\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.354100 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.361896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"cni-sysctl-allowlist-ds-nwj8k\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") " pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.362243 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-j577t" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.387863 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.393230 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lspkm\" (UniqueName: \"kubernetes.io/projected/03e0f458-ccd0-429e-ae37-d4c1fd2946bf-kube-api-access-lspkm\") pod \"package-server-manager-789f6589d5-8pm55\" (UID: \"03e0f458-ccd0-429e-ae37-d4c1fd2946bf\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.404023 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.416726 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"collect-profiles-29483415-hwxpp\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.426400 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.426807 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:58.926786022 +0000 UTC m=+36.835525954 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.430915 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp4c9\" (UniqueName: \"kubernetes.io/projected/5c00abc0-dc46-406c-8f2f-6904ac88126d-kube-api-access-qp4c9\") pod \"ingress-operator-5b745b69d9-rwv28\" (UID: \"5c00abc0-dc46-406c-8f2f-6904ac88126d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.459927 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.460371 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24clt\" (UniqueName: \"kubernetes.io/projected/a6e00143-8d6c-45fb-aa6c-44015c27a3f1-kube-api-access-24clt\") pod \"service-ca-operator-777779d784-s6zqk\" (UID: \"a6e00143-8d6c-45fb-aa6c-44015c27a3f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.468747 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvv8z\" (UniqueName: \"kubernetes.io/projected/c3f6d778-ef18-4ad7-bd13-fb7e5983af23-kube-api-access-gvv8z\") pod \"ingress-canary-n9vh6\" (UID: \"c3f6d778-ef18-4ad7-bd13-fb7e5983af23\") " pod="openshift-ingress-canary/ingress-canary-n9vh6" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.486374 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.493419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmpqt\" (UniqueName: \"kubernetes.io/projected/48af697e-308a-4bdd-a5d8-d86cd5c4fb0c-kube-api-access-bmpqt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jtj6g\" (UID: \"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.513510 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4db\" (UniqueName: \"kubernetes.io/projected/cc0cfb45-abdd-434f-ae63-f6ae0fc7c092-kube-api-access-5n4db\") pod \"etcd-operator-b45778765-lrm9f\" (UID: \"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092\") " pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.519777 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.525060 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97vbj\" (UniqueName: \"kubernetes.io/projected/4e042627-4d69-4cc5-a00d-849fe4ce76f0-kube-api-access-97vbj\") pod \"migrator-59844c95c7-d7wmg\" (UID: \"4e042627-4d69-4cc5-a00d-849fe4ce76f0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.527828 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.528750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.529037 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.029025499 +0000 UTC m=+36.937765431 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.535313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-njjgs"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.546930 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.554072 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.557205 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.560466 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5wb\" (UniqueName: \"kubernetes.io/projected/728ae7a4-9793-4555-abbb-b8a352700089-kube-api-access-lj5wb\") pod \"machine-config-server-tx54b\" (UID: \"728ae7a4-9793-4555-abbb-b8a352700089\") " pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.571313 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.588910 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.622465 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7vkvw"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.627534 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.630327 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.630445 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.130398052 +0000 UTC m=+37.039137984 (durationBeforeRetry 500ms). 
Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.630445 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.130398052 +0000 UTC m=+37.039137984 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.632482 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.634903 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.134888697 +0000 UTC m=+37.043628629 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.647080 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n9vh6"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.670895 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.757924 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f"
Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.766205 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.771315 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.271288543 +0000 UTC m=+37.180028485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.805430 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-tx54b" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.819347 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk"] Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.876947 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:58 crc kubenswrapper[4720]: E0121 14:29:58.877376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.377354137 +0000 UTC m=+37.286094069 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.908921 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" event={"ID":"07f01852-61b7-4eee-acd6-3d8b8e2b1c85","Type":"ContainerStarted","Data":"caea177b5f0adf7c5f6e8a02e65d3f9ae7d67ff099a93964e7ab412ce9e46e6c"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.932088 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerStarted","Data":"730d32c8330d5c4334a6753d43cdbf6d8a2df14b78b053f47d3d07095fcee77d"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.967701 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerStarted","Data":"54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.968532 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.969721 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" event={"ID":"61315eef-fa85-4828-9668-f6f4b1484453","Type":"ContainerStarted","Data":"55f075296e8423f9114226e936856791cb02be1ccdcb20ed13ad16519691ab13"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.970914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"911c154ea6ac5002a0cd6e707382e586c9634aa743c1b2f50d132f6faf6b5bf0"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.971671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9d74757fc7f58cf1875d3f50e447fa64857a292eac2b4e0e9c52c2afee2bb99d"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.972088 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.972518 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"646f87b40b0d5a35de4464510145dd02118820182cc72662a1458d7500c532a5"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.973076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wmxb9" event={"ID":"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f","Type":"ContainerStarted","Data":"14ed6946c854a6a14009130861a7696037b95e5d52a4a6dd40a1adb4c9d59449"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.973529 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5qcz5" event={"ID":"f55572f9-fbba-4efa-a6a8-94884f06f9c3","Type":"ContainerStarted","Data":"352be9fbf51911f89bd19de761e00fd4ded5396187c7941f699a80e469b3f65f"} Jan 21 14:29:58 crc kubenswrapper[4720]: I0121 14:29:58.974129 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerStarted","Data":"28165debc992515a62bbac33db73e05a5347bebc002b160765e6c1b991bcf92e"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.013806 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.015066 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.515047379 +0000 UTC m=+37.423787311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.047319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68kgl" event={"ID":"aa2b643f-ce1f-45db-ba7f-31a5fc037650","Type":"ContainerStarted","Data":"3ed02c720c537b7e78dbc2a5f6f2d51e8ef65ed74b8145de14b35b2453cf5f8c"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.064045 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fa78d6afe2df95f13e50a466ef3667471e4037d61eb2a4603f8ac150650e44de"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.064086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7be625850e4e33be532147f3db97270396571f8fca40f191e7ed9c2f68ac3408"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.090980 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.101841 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"c7e2197c41007ce5863566b89eb987559f54f35ec9ebeedd08355a3bfb6357f3"} Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.117229 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.117552 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.617541173 +0000 UTC m=+37.526281105 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.218005 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.218221 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.718191455 +0000 UTC m=+37.626931397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.218516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.218818 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.718810972 +0000 UTC m=+37.627550904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.230418 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.288763 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59"] Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.319771 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.319914 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.819886028 +0000 UTC m=+37.728625960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.320049 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.320319 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.820312139 +0000 UTC m=+37.729052071 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.421257 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.421473 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.421898 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:29:59.921874937 +0000 UTC m=+37.830614869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.426835 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/139c8416-e015-49e4-adfe-32f9e142621f-metrics-certs\") pod \"network-metrics-daemon-x48m6\" (UID: \"139c8416-e015-49e4-adfe-32f9e142621f\") " pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.523062 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.523789 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.023764684 +0000 UTC m=+37.932504646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.533746 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-x48m6" Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.624020 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.624308 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.124292694 +0000 UTC m=+38.033032626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.725765 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.726223 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.226211592 +0000 UTC m=+38.134951524 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.827128 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.827550 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.327524323 +0000 UTC m=+38.236264255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.928226 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:29:59 crc kubenswrapper[4720]: E0121 14:29:59.928476 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.428465854 +0000 UTC m=+38.337205786 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.968677 4720 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7xcc8 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:29:59 crc kubenswrapper[4720]: I0121 14:29:59.968728 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.021779 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podStartSLOduration=18.02176278 podStartE2EDuration="18.02176278s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:00.021414211 +0000 UTC m=+37.930154143" watchObservedRunningTime="2026-01-21 14:30:00.02176278 +0000 UTC m=+37.930502712" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.029339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.029712 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.529698873 +0000 UTC m=+38.438438805 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.126584 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.130948 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.131708 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132883 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132909 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.132930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.133011 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.133321 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.633302658 +0000 UTC m=+38.542042580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.143234 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.233585 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.234181 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.734157147 +0000 UTC m=+38.642897079 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234275 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234461 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.234535 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.234875 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.734862476 +0000 UTC m=+38.643602408 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.235267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.237411 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.290397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"collect-profiles-29483430-jcp9p\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.335582 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.335711 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.835690754 +0000 UTC m=+38.744430686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.335987 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.336257 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.836249989 +0000 UTC m=+38.744989921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.437026 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.437146 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.937128328 +0000 UTC m=+38.845868260 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.437354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.437800 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:00.937782397 +0000 UTC m=+38.846522329 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.446011 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.496971 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"fe964df5a682fc1d0ca756c59c958ece1dabea08ea21b2da6e7316f8e60564ce"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.534797 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.537635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.537857 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.037843602 +0000 UTC m=+38.946583524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.546501 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerStarted","Data":"830f00cd4952a252732ae85fe73bd3c43f95902077b3e9a257094be91b79359d"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.549259 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerStarted","Data":"543f488bde8e56735d3445c774cd0398dc361a33f186699055133f8c7fa305d8"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.557097 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.561774 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"f899e54eba5bb0854d1b9456c7cf01b8a5a481e9e3238929b78818f3b24c5b44"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.586460 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"38c854ff6a882796bf5206f1b2a34d74cc62160e677f74ec8a029aa0fa14c832"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.613984 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" event={"ID":"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb","Type":"ContainerStarted","Data":"59913a4d0ff59fcd0eab8b556446aa66c2becc4ceb9ccb98fbf6b567d3915574"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.642321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.642570 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.142559489 +0000 UTC m=+39.051299421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.745141 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.745353 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.245338662 +0000 UTC m=+39.154078594 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.789231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" event={"ID":"4a47e9b4-6318-4f71-9db0-105be2ada134","Type":"ContainerStarted","Data":"2d753d0c2455fc03b4f3a72d390e6307ba52fba190bd7b98c5a610d05b2c4ee5"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.798282 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"985ce529eb1d976cd0ffd92077ff301852642b7b995a148d25e1219249de4740"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.799390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"93344ee17cf14cce453ccb518314043a33398185bb204146ef9bdfb7b437dacc"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.801866 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerStarted","Data":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.804331 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.834786 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" podStartSLOduration=18.834772349 
podStartE2EDuration="18.834772349s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:00.833718 +0000 UTC m=+38.742457942" watchObservedRunningTime="2026-01-21 14:30:00.834772349 +0000 UTC m=+38.743512281" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.845137 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.848568 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.849215 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.349195844 +0000 UTC m=+39.257935776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:00 crc kubenswrapper[4720]: I0121 14:30:00.955330 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:00 crc kubenswrapper[4720]: E0121 14:30:00.955573 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.455555767 +0000 UTC m=+39.364295699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.056849 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.057170 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.557158556 +0000 UTC m=+39.465898488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.157451 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.158118 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.658096496 +0000 UTC m=+39.566836428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.260326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.260614 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.760602592 +0000 UTC m=+39.669342524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.271592 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.362762 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.362985 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.862957362 +0000 UTC m=+39.771697294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.363179 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.363887 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.863878947 +0000 UTC m=+39.772618879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.466616 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.466790 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.966761043 +0000 UTC m=+39.875500985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.468346 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.470545 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:01.970521928 +0000 UTC m=+39.879261860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.521311 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-j577t"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.524198 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7mfnf"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.569497 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.569645 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.069624087 +0000 UTC m=+39.978364019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.570001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.570347 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.070333437 +0000 UTC m=+39.979073369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.613583 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55"] Jan 21 14:30:01 crc kubenswrapper[4720]: W0121 14:30:01.662482 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1cfb10_4405_4ab9_8631_690622069d01.slice/crio-74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2 WatchSource:0}: Error finding container 74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2: Status 404 returned error can't find the container with id 74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2 Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.670501 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.670916 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.170898218 +0000 UTC m=+40.079638150 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.773724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.774075 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.27405913 +0000 UTC m=+40.182799062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: W0121 14:30:01.842859 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d3c944_8def_4f95_a3cb_781f929f5f28.slice/crio-97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4 WatchSource:0}: Error finding container 97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4: Status 404 returned error can't find the container with id 97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4 Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.874052 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl"] Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.875162 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.875410 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.375393082 +0000 UTC m=+40.284133014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.941311 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"97e218b3b126bf9534123d7899f10f559637b9f620034ec27917fa080f36a0bd"} Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.969922 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"8eb7645d76fc197f4b6ce9d91e9c31dfae161f32772e83b30ac5430eeabd4a4a"} Jan 21 14:30:01 crc kubenswrapper[4720]: I0121 14:30:01.979920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:01 crc kubenswrapper[4720]: E0121 14:30:01.980197 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.480183971 +0000 UTC m=+40.388923903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.011821 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerStarted","Data":"617f70e18e4e0f9b72a22ff92ce1fc94aae99827e9d16ba9cde606ce5a9e499c"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.012465 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.041527 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-68kgl" event={"ID":"aa2b643f-ce1f-45db-ba7f-31a5fc037650","Type":"ContainerStarted","Data":"b1a4cab934777cf770ad7290079c5da558f7333410699ddf844c2a8c64c585b5"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.042475 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.063150 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" event={"ID":"ac33402e-edb9-41ab-bb76-b17108b5ea0d","Type":"ContainerStarted","Data":"ced6fef271fbd98e2b6b3396d759018065bba984c864f66fa3e3d7be0d03573a"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.080323 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.081599 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.581583154 +0000 UTC m=+40.490323086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.091423 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" event={"ID":"1a75d5de-a507-41ca-8206-eae702d16020","Type":"ContainerStarted","Data":"e5a5cec21014d0688ed0467706fadd3d5b85174d4bb148ee1c3623b6fe9f43a2"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.094306 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac39f2f_2411_4585_b15c_c473b2fdc077.slice/crio-6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c WatchSource:0}: Error finding container 6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c: Status 404 returned error can't find the container with id 6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.118588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" event={"ID":"07f01852-61b7-4eee-acd6-3d8b8e2b1c85","Type":"ContainerStarted","Data":"c0fbadd1246427f81683f1be63e303c0ac271a63273c1af4367d0f668557c8f5"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.142418 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"4c60f97e493a2e76e74de04c15dfa162453b75bb780736505c177f53ef30ab92"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.151231 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e042627_4d69_4cc5_a00d_849fe4ce76f0.slice/crio-ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857 WatchSource:0}: Error finding container ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857: Status 404 returned error can't find the container with id ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.155163 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" event={"ID":"61315eef-fa85-4828-9668-f6f4b1484453","Type":"ContainerStarted","Data":"4e25f17cddec9b580bbcb602930a790de59b6ea31c39e52a64dc1e8ad2d3faa2"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.157994 4720 generic.go:334] "Generic (PLEG): container finished" podID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerID="abf435badffbc5ced9e3437c64cfcc64798e18b35727d62a3c9141855d35cd94" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.158077 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerDied","Data":"abf435badffbc5ced9e3437c64cfcc64798e18b35727d62a3c9141855d35cd94"} Jan 21 14:30:02 crc kubenswrapper[4720]: 
I0121 14:30:02.201547 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.202107 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.702091114 +0000 UTC m=+40.610831046 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.214167 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" event={"ID":"4a47e9b4-6318-4f71-9db0-105be2ada134","Type":"ContainerStarted","Data":"3ef40d035796e973c9c22661f757ce13326369cced21ba41ca9890b9313d2d47"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.232419 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerDied","Data":"6916309afb8a14b18f202b1fd06253cdb2b2c4bbbebd08448656d331945a6344"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.232581 4720 generic.go:334] "Generic (PLEG): container finished" podID="90b6768c-8240-4fc1-a760-59d79a3c1c02" containerID="6916309afb8a14b18f202b1fd06253cdb2b2c4bbbebd08448656d331945a6344" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.237270 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" event={"ID":"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6","Type":"ContainerStarted","Data":"b3b04114286f227c680a1ea5cf6cd19487e347d3c26ab08256886d11a48843d9"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.277862 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-nscjl" podStartSLOduration=20.277838248 podStartE2EDuration="20.277838248s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.236414027 +0000 UTC m=+40.145153959" watchObservedRunningTime="2026-01-21 14:30:02.277838248 +0000 UTC m=+40.186578190" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.278580 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-68kgl" podStartSLOduration=21.278573699 podStartE2EDuration="21.278573699s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.19985318 +0000 UTC m=+40.108593122" watchObservedRunningTime="2026-01-21 14:30:02.278573699 +0000 UTC m=+40.187313631" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.303595 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerStarted","Data":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.307804 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.308706 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.808690633 +0000 UTC m=+40.717430565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.326295 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" event={"ID":"b58aaa2f-44f4-4ab8-afdf-8b7f3c0e7317","Type":"ContainerStarted","Data":"6e84c6d90cb593eb086278fd37d11ee3fed453a2060bb4188e4ef4fb7a4d218f"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.341847 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-h9ckd" podStartSLOduration=20.341829252 podStartE2EDuration="20.341829252s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.333810467 +0000 UTC m=+40.242550409" watchObservedRunningTime="2026-01-21 14:30:02.341829252 +0000 UTC m=+40.250569184" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.390046 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.397723 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-p6shd" podStartSLOduration=20.397699989 podStartE2EDuration="20.397699989s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.387696039 +0000 UTC m=+40.296435981" watchObservedRunningTime="2026-01-21 14:30:02.397699989 +0000 UTC m=+40.306439921" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 
14:30:02.408998 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.409387 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:02.909373146 +0000 UTC m=+40.818113078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.432591 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-lrm9f"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.456556 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.482219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" event={"ID":"25067bcc-8503-442b-b348-87d7e1321dbd","Type":"ContainerStarted","Data":"fe3c5854542ee296733c003d876d747f7a26a5fd7fa183542d37f1a656b67cec"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.483207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.509583 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.511085 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.011065198 +0000 UTC m=+40.919805130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.524780 4720 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-86dvk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.527342 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podUID="25067bcc-8503-442b-b348-87d7e1321dbd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.554049 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b43777fe6ea26bdc85574cfeba6c1859f9374dabe02fcf1d36c70c6f1335d99b"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.584389 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-92xp4" podStartSLOduration=20.584372104 podStartE2EDuration="20.584372104s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.539863705 +0000 UTC m=+40.448603677" watchObservedRunningTime="2026-01-21 14:30:02.584372104 +0000 UTC m=+40.493112036" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.585472 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"5a8050bcef0cf8fb732b68561f00f528026eaa34ede030eddcf4dcd5080b6027"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.603384 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" event={"ID":"29766114-9e0b-4064-8010-8f426935f834","Type":"ContainerStarted","Data":"6e6cbcd76291b87b152f21b35c336619a38f7f472b65b95b758fff01b973265d"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.604762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" event={"ID":"61f96497-68d8-4347-b831-f7bc0204c677","Type":"ContainerStarted","Data":"5d848b34325d188a6dff5cc5277de22b8b084fa3ca1f2ceb5d3804f9cd4823f9"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.612630 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: 
\"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.613995 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.113977964 +0000 UTC m=+41.022717896 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.625935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerStarted","Data":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.626363 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.669003 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" event={"ID":"fd1cfb10-4405-4ab9-8631-690622069d01","Type":"ContainerStarted","Data":"74313acf4962ad44eef002ad9abf344b704e04361c558c8c43da2b2b4c2491b2"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.718104 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.718196 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.218178477 +0000 UTC m=+41.126918419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.718784 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.719114 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.219101762 +0000 UTC m=+41.127841694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.724301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tx54b" event={"ID":"728ae7a4-9793-4555-abbb-b8a352700089","Type":"ContainerStarted","Data":"635b5fa44bf914969c674a4d54bb1c50eb6e0f9d246af8687dcc8add89711c07"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.781945 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wmxb9" event={"ID":"120bd3b2-5437-4a15-bcc4-32ae06eb7f1f","Type":"ContainerStarted","Data":"c027252a8c92ff92166d093045af10483648801b002abfe63a08aa857ac45d75"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.781994 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.809801 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerStarted","Data":"606f33407deb43968f7cc7f66c83d922a2e45672a6ac0cad952ee6a566842321"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.811118 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.811179 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: 
connect: connection refused" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.812919 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-42g76" podStartSLOduration=21.812904812 podStartE2EDuration="21.812904812s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.794426604 +0000 UTC m=+40.703166546" watchObservedRunningTime="2026-01-21 14:30:02.812904812 +0000 UTC m=+40.721644744" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.813272 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cnk8x" podStartSLOduration=21.813265223 podStartE2EDuration="21.813265223s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:02.635061115 +0000 UTC m=+40.543801047" watchObservedRunningTime="2026-01-21 14:30:02.813265223 +0000 UTC m=+40.722005155" Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.827766 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.828189 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.328170221 +0000 UTC m=+41.236910163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.862737 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f685084-f748-4a34-9020-4d562f2a6d45" containerID="2d7b0571d83db10d3caafeac10de4c427dd5874863779648edb433b1cbbca003" exitCode=0 Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.862828 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerDied","Data":"2d7b0571d83db10d3caafeac10de4c427dd5874863779648edb433b1cbbca003"} Jan 21 14:30:02 crc kubenswrapper[4720]: W0121 14:30:02.897397 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd390eca3_a064_441f_b469_3111e626bcae.slice/crio-20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb WatchSource:0}: Error finding container 20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb: Status 404 returned error can't find the container with id 20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.899010 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-5qcz5" event={"ID":"f55572f9-fbba-4efa-a6a8-94884f06f9c3","Type":"ContainerStarted","Data":"9e3871719184587eb6610bac36b8397f0e697272b42dcf037846d27b27051e59"} Jan 21 14:30:02 crc kubenswrapper[4720]: I0121 14:30:02.934294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:02 crc kubenswrapper[4720]: E0121 14:30:02.935970 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.435956633 +0000 UTC m=+41.344696575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.000851 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"9fef56ce8894e77f9bfde39511b1715f349de7b85e40b799dde370407d27e676"} Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.066069 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.078223 4720 patch_prober.go:28] interesting pod/console-operator-58897d9998-68kgl container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.078282 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-68kgl" podUID="aa2b643f-ce1f-45db-ba7f-31a5fc037650" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.094937 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.113337 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.613310237 +0000 UTC m=+41.522050179 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.126353 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.142452 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:03 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:03 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:03 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.142498 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.219737 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.221101 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.7210892 +0000 UTC m=+41.629829132 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.277567 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.325646 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.325920 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.825903049 +0000 UTC m=+41.734642981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.427635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.428124 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:03.928110465 +0000 UTC m=+41.836850397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.535517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.535595 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.035578738 +0000 UTC m=+41.944318670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.536045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.536311 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.036304159 +0000 UTC m=+41.945044091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.595100 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" podStartSLOduration=21.595078057 podStartE2EDuration="21.595078057s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.593807892 +0000 UTC m=+41.502547854" watchObservedRunningTime="2026-01-21 14:30:03.595078057 +0000 UTC m=+41.503817989"
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.636846 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.637382 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.137366253 +0000 UTC m=+42.046106185 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.693411 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podStartSLOduration=21.693389154 podStartE2EDuration="21.693389154s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.68502557 +0000 UTC m=+41.593765502" watchObservedRunningTime="2026-01-21 14:30:03.693389154 +0000 UTC m=+41.602129086"
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.739946 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.740197 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.240186317 +0000 UTC m=+42.148926249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.842278 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
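The entries above are one full cycle of the failure that dominates this stretch of the log: the volume reconciler keeps trying to mount the image-registry PVC for the new pod (UID ccf13312-...) and to tear it down for the departed pod (UID 8f668bae-...), and both paths fail in newCsiDriverClient because no CSI driver named kubevirt.io.hostpath-provisioner has registered with this kubelet yet; each failed operation is embargoed for the fixed durationBeforeRetry of 500ms and then retried. To gauge how noisy this is and whether more than one driver or volume is involved, a minimal counting sketch, assuming the journal has been saved to a plain-text file named kubelet.log (hypothetical name):

    import re
    from collections import Counter

    DRIVER = re.compile(r'driver name (\S+) not found in the list of registered CSI drivers')
    VOLUME = re.compile(r'volume "(pvc-[0-9a-f-]+)"')

    drivers, volumes = Counter(), Counter()
    with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical saved journal
        for line in fh:
            m = DRIVER.search(line)
            if not m:
                continue
            drivers[m.group(1)] += 1          # which driver name the kubelet could not find
            v = VOLUME.search(line)
            if v:
                volumes[v.group(1)] += 1      # which PVC the failing operation was for

    for name, hits in drivers.most_common():
        print(f"{hits:4d}  {name}")
    for vol, hits in volumes.most_common():
        print(f"{hits:4d}  {vol}")

On this log it would report a single driver name and a single PVC with many hits, which points at one not-yet-registered hostpath provisioner rather than a broader CSI problem.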
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.842583 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.342563668 +0000 UTC m=+42.251303600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.958694 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:03 crc kubenswrapper[4720]: E0121 14:30:03.959191 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.459180518 +0000 UTC m=+42.367920450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.990342 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-tx54b" podStartSLOduration=8.990328041 podStartE2EDuration="8.990328041s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.988318705 +0000 UTC m=+41.897058637" watchObservedRunningTime="2026-01-21 14:30:03.990328041 +0000 UTC m=+41.899067973"
Jan 21 14:30:03 crc kubenswrapper[4720]: I0121 14:30:03.991610 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wmxb9" podStartSLOduration=22.991592897 podStartE2EDuration="22.991592897s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:03.869402031 +0000 UTC m=+41.778141983" watchObservedRunningTime="2026-01-21 14:30:03.991592897 +0000 UTC m=+41.900332829"
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.060761 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.061046 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.561030954 +0000 UTC m=+42.469770876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.069242 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-x48m6"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.082844 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.098124 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.122761 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.140073 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:04 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:04 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:04 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.140152 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.145199 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-5qcz5" podStartSLOduration=22.145182294 podStartE2EDuration="22.145182294s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.140956396 +0000 UTC m=+42.049696328" watchObservedRunningTime="2026-01-21 14:30:04.145182294 +0000 UTC m=+42.053922226"
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.177905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"ecdf77c66e058e10226dc0eef1e41ffb5f42ce9d32b1c321d11e17c2b0482857"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.178411 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.178712 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.678700784 +0000 UTC m=+42.587440716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.182454 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n9vh6"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.279507 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.279929 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.779896252 +0000 UTC m=+42.688636184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.279979 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
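Each of these failure records carries enough data to verify the backoff arithmetic: the klog timestamp on the line is when the operation failed, and the "No retries permitted until" deadline should sit exactly durationBeforeRetry (500ms) later. A small consistency check, under the same hypothetical kubelet.log file assumption (the deadline's nanosecond fraction is trimmed because datetime.strptime's %f accepts at most six digits):

    import re
    from datetime import datetime

    STAMP = re.compile(r'[IEW]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})')
    DEADLINE = re.compile(r'No retries permitted until \d{4}-\d{2}-\d{2} (\d{2}:\d{2}:\d{2}\.\d+)')

    with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical saved journal
        for line in fh:
            d = DEADLINE.search(line)
            s = STAMP.search(line)
            if d and s:
                failed = datetime.strptime(s.group(1), "%H:%M:%S.%f")
                retry = datetime.strptime(d.group(1)[:15], "%H:%M:%S.%f")  # keep 6 fractional digits
                print(f"{s.group(1)} -> retry embargo +{(retry - failed).total_seconds():.3f}s")

Run over the entries here it would print +0.500s for every failure, confirming a fixed half-second embargo rather than an exponential backoff at this stage.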
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.280393 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.780385595 +0000 UTC m=+42.689125527 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.338099 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.366303 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" event={"ID":"22087b1b-3ded-441f-8349-fb8f38809460","Type":"ContainerStarted","Data":"a4002e248d870a6134914a3582a580238184afda7f20d29a8c18d9b22c08bbd4"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.394105 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk"]
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.395301 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.395716 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.89569633 +0000 UTC m=+42.804436262 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.415998 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"97fd1f677765ee21b274241e0156758e29a09c3cf56d15597c9447e41813dab4"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.469915 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" event={"ID":"61f96497-68d8-4347-b831-f7bc0204c677","Type":"ContainerStarted","Data":"1c66c0276821602c5edbec1cb152195002103444752fa96b7a64e6c92b173f44"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.495216 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"0a4b393749e8bb158083a687b0e28dc40131fb09d24ec98c35de9bfd981c5f0f"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.499289 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.499632 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:04.999612264 +0000 UTC m=+42.908352196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: W0121 14:30:04.523682 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e00143_8d6c_45fb_aa6c_44015c27a3f1.slice/crio-d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe WatchSource:0}: Error finding container d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe: Status 404 returned error can't find the container with id d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.537282 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7vkvw" podStartSLOduration=22.537260359 podStartE2EDuration="22.537260359s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.535302534 +0000 UTC m=+42.444042466" watchObservedRunningTime="2026-01-21 14:30:04.537260359 +0000 UTC m=+42.446000291"
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.550149 4720 csr.go:261] certificate signing request csr-f6lkn is approved, waiting to be issued
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.583854 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"3a60f3401a971822a6c3c5e1c76e45f5b43f1881ba3b0903071ff46ed3267b0f"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.597044 4720 csr.go:257] certificate signing request csr-f6lkn is issued
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.601199 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.601343 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.101318636 +0000 UTC m=+43.010058568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.601372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.601699 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.101687456 +0000 UTC m=+43.010427388 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.617095 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" event={"ID":"139e5d2d-e00c-4dc9-a6d9-2edc4a09e5e6","Type":"ContainerStarted","Data":"bf2039fd70760d288df96f1522264b3b31e14233f6babd676e9244f9ca4f09f1"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.631502 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" event={"ID":"29766114-9e0b-4064-8010-8f426935f834","Type":"ContainerStarted","Data":"dfefb22453d85c58abe579cd329abb20490bd945e049784272f03cd2584dfb08"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.647207 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" event={"ID":"ac33402e-edb9-41ab-bb76-b17108b5ea0d","Type":"ContainerStarted","Data":"46d5892660319f84ea1e8cd800ced58253dd8f8caf952b347dc69db7674d69c5"}
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.648374 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.666580 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd"
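Interleaved with the volume errors, pod_startup_latency_tracker reports how long each pod took from creation to observed-running (podStartSLOduration); the zeroed firstStartedPulling/lastFinishedPulling timestamps ("0001-01-01 00:00:00") indicate image pulls did not contribute to the measured window for these pods. To pull those durations out and rank the slowest starters, a sketch under the same hypothetical kubelet.log assumption:

    import re

    ENTRY = re.compile(
        r'Observed pod startup duration" pod="(?P<pod>[^"]+)"'
        r'.*?podStartSLOduration=(?P<slo>[\d.]+)'
    )

    durations = {}
    with open("kubelet.log", encoding="utf-8") as fh:  # hypothetical saved journal
        for line in fh:
            m = ENTRY.search(line)
            if m:
                durations[m.group("pod")] = float(m.group("slo"))

    # Slowest starters first; on this log most cluster operators cluster around 21-24s.
    for pod, secs in sorted(durations.items(), key=lambda kv: -kv[1]):
        print(f"{secs:7.2f}s  {pod}")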
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.702867 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.202845013 +0000 UTC m=+43.111584945 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.703074 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.711554 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.211538966 +0000 UTC m=+43.120278898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.718818 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" event={"ID":"aa4e660f-7816-4c20-b94c-5f9543d9cbed","Type":"ContainerStarted","Data":"67f9df6a7e20f615514512c6c244bd455467fbc7e5a228f1cf92ec6c3864a1a5"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.732073 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" event={"ID":"d4105ff8-7f4b-4dc9-8ceb-3d136236e8fb","Type":"ContainerStarted","Data":"3d70726aeba065ed4519327507bc91405caf8af69dc543f70496090a17ba27cf"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.754511 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lhd59" podStartSLOduration=22.754481861 podStartE2EDuration="22.754481861s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.697247996 +0000 UTC m=+42.605987958" watchObservedRunningTime="2026-01-21 14:30:04.754481861 +0000 UTC m=+42.663221803" Jan 21 14:30:04 crc 
kubenswrapper[4720]: I0121 14:30:04.765582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-tx54b" event={"ID":"728ae7a4-9793-4555-abbb-b8a352700089","Type":"ContainerStarted","Data":"eef27d30b0469b1e3c32696913cf322af70812c4540f276b721a15ab9fb08687"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.810906 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"b911f58a7cbfdcc89c70d4e190476bc6c151db9ab249fad967fcfa77f4c4b7c8"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.814280 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.815338 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.315319637 +0000 UTC m=+43.224059579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.850693 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gjtkx" podStartSLOduration=22.850676989 podStartE2EDuration="22.850676989s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.848938849 +0000 UTC m=+42.757678781" watchObservedRunningTime="2026-01-21 14:30:04.850676989 +0000 UTC m=+42.759416931" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.880791 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerStarted","Data":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882417 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882490 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.882514 4720 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.925161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:04 crc kubenswrapper[4720]: E0121 14:30:04.926211 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.426201127 +0000 UTC m=+43.334941059 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.955966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" event={"ID":"25067bcc-8503-442b-b348-87d7e1321dbd","Type":"ContainerStarted","Data":"a86027c8a1d6f97abdd1ac248e6a6795566fdbfdc287504cfc91affcaede3c41"} Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.969941 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-mxvkb" podStartSLOduration=23.969921823 podStartE2EDuration="23.969921823s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.968064461 +0000 UTC m=+42.876804413" watchObservedRunningTime="2026-01-21 14:30:04.969921823 +0000 UTC m=+42.878661775" Jan 21 14:30:04 crc kubenswrapper[4720]: I0121 14:30:04.970317 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qkmbd" podStartSLOduration=22.970309693 podStartE2EDuration="22.970309693s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:04.930784055 +0000 UTC m=+42.839523997" watchObservedRunningTime="2026-01-21 14:30:04.970309693 +0000 UTC m=+42.879049625" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004309 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004360 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" 
event={"ID":"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092","Type":"ContainerStarted","Data":"e226ecb176783d4e83f6dbdf832c07112a9be1248aca29842d32bb46e309cafe"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.004505 4720 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.008951 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-gnrtf" podStartSLOduration=23.008932107 podStartE2EDuration="23.008932107s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.004706698 +0000 UTC m=+42.913446650" watchObservedRunningTime="2026-01-21 14:30:05.008932107 +0000 UTC m=+42.917672049" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.027488 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.028200 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.528183526 +0000 UTC m=+43.436923458 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.036433 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" event={"ID":"afb1ffca-e30f-47cf-b399-2bd057039b10","Type":"ContainerStarted","Data":"2887016d2bf00c16b4c00748b47e857869446bf465448174bb20e4e6131f872f"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.037293 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.044002 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"6c1742221c6fc2ea79651daddda07dafe086f710ffb076a28589d9c2168776a2"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.044040 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"6644131028ff0934bd5eebe9d63a3543b23a3cb63a40555fbbeef36e01c4d63c"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.047265 4720 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles" containerID="cri-o://a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" gracePeriod=30 Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.047538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerStarted","Data":"20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb"} Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.048881 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.048948 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.063494 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-68kgl" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.081148 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podStartSLOduration=24.081125531 podStartE2EDuration="24.081125531s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.078261791 +0000 UTC m=+42.987001733" watchObservedRunningTime="2026-01-21 14:30:05.081125531 +0000 UTC m=+42.989865463" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.081786 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podStartSLOduration=23.08178082 podStartE2EDuration="23.08178082s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.041330325 +0000 UTC m=+42.950070257" watchObservedRunningTime="2026-01-21 14:30:05.08178082 +0000 UTC m=+42.990520752" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.128519 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:05 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:05 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:05 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.128572 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:05 crc 
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.132722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.139798 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.639780866 +0000 UTC m=+43.548520798 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.234134 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.234454 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.734439131 +0000 UTC m=+43.643179063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.271379 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.329126 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" podStartSLOduration=24.329108415 podStartE2EDuration="24.329108415s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:05.187239097 +0000 UTC m=+43.095979029" watchObservedRunningTime="2026-01-21 14:30:05.329108415 +0000 UTC m=+43.237848347"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.336367 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.336753 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.836738949 +0000 UTC m=+43.745478881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.437459 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.437861 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:05.937835364 +0000 UTC m=+43.846575296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.441857 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"]
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.442788 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.446012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.538987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.539044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.539482 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.039464095 +0000 UTC m=+43.948204027 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.600402 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-21 14:25:04 +0000 UTC, rotation deadline is 2026-10-20 09:57:01.438317231 +0000 UTC
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.600645 4720 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6523h26m55.837674681s for next certificate rotation
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.634966 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5qbdf"]
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.636079 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.637890 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640555 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640587 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.640668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.641299 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.641385 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.141367592 +0000 UTC m=+44.050107534 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.641607 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.661991 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"]
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.679956 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"]
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.725909 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"certified-operators-v6vwc\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") " pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741418 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.741533 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.742016 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.242005504 +0000 UTC m=+44.150745436 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842048 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842685 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842744 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.842804 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.843411 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.843487 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.34347082 +0000 UTC m=+44.252210752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.843742 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.857608 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.858583 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.869430 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.876240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"community-operators-5qbdf\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") " pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.897812 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"] Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947628 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947689 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.947711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" 
(UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:05 crc kubenswrapper[4720]: E0121 14:30:05.948167 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.448148456 +0000 UTC m=+44.356888458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.961401 4720 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-86dvk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 21 14:30:05 crc kubenswrapper[4720]: I0121 14:30:05.961478 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" podUID="25067bcc-8503-442b-b348-87d7e1321dbd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048549 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048825 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048857 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.048905 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.049280 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.54923966 +0000 UTC m=+44.457979592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.049869 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.049987 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.050172 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.050881 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.094844 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" event={"ID":"22087b1b-3ded-441f-8349-fb8f38809460","Type":"ContainerStarted","Data":"c8eceb60572f6162c0c955f8aad2dcef86faf6e1f36706504931972ad6124fd0"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.096794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"968913338eb8518c6dbbe73e98e64885086a293721a8f30f0b13f8c4d3aba2de"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.097788 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerStarted","Data":"651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.117124 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"certified-operators-lt46m\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") " pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.119444 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerStarted","Data":"316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 
14:30:06.119911 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.128870 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vh6" event={"ID":"c3f6d778-ef18-4ad7-bd13-fb7e5983af23","Type":"ContainerStarted","Data":"fecb40ecb9afe912623a7175e16d918f02eabe55a49dcc1e0427eedc76b4af1b"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.133189 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:06 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:06 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:06 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.133255 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.151596 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-swblx" podStartSLOduration=24.15157865 podStartE2EDuration="24.15157865s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.15155972 +0000 UTC m=+44.060299672" watchObservedRunningTime="2026-01-21 14:30:06.15157865 +0000 UTC m=+44.060318582" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152083 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152100 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" event={"ID":"9a655c79-a709-4d61-8209-200b86144e8b","Type":"ContainerStarted","Data":"49b6cad3737585c15ad5079706732ff2c45073b70a5e605429f16aca057087d7"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152170 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152196 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.152343 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.152705 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.652692782 +0000 UTC m=+44.561432714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.176448 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" event={"ID":"03e0f458-ccd0-429e-ae37-d4c1fd2946bf","Type":"ContainerStarted","Data":"c768102736fae3e06a7547588d93ac80d31bd9677eec427f354190b508c4fbb2"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.176982 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.190674 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.197119 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" event={"ID":"fd1cfb10-4405-4ab9-8631-690622069d01","Type":"ContainerStarted","Data":"4e8987d42ba4c5529c8b73b93b8fc3182fe8607ef6e1da8e5fd63d9195c2fd46"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.204257 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" event={"ID":"1796695a-873c-4c15-9351-9b5bc5607830","Type":"ContainerStarted","Data":"d997dc3690941e9c3ebc8356d68a0504b34573122bc3142c2aecfa544b68ad28"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.211557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" event={"ID":"90b6768c-8240-4fc1-a760-59d79a3c1c02","Type":"ContainerStarted","Data":"67b6377e3711859f3193aa79202ee17c2f2c10ee776aabe85bf9c31512f966ad"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.214898 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.229667 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"54aaf0431c1fc000e77bbdd850b6650c74c7fd8a9e5e80188424d5938f4ecefe"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.234011 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" event={"ID":"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c","Type":"ContainerStarted","Data":"98c98fbadf7f3e7f292782d1cddb7912face0c2f029cfd169f3551f42aaf30d1"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.235184 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29483415-hwxpp_d390eca3-a064-441f-b469-3111e626bcae/collect-profiles/0.log" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.235276 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.243981 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podStartSLOduration=11.243963111 podStartE2EDuration="11.243963111s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.24287589 +0000 UTC m=+44.151615822" watchObservedRunningTime="2026-01-21 14:30:06.243963111 +0000 UTC m=+44.152703043" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.244306 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.249544 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" event={"ID":"fccce0ee-16e1-4237-8081-a6a3c93c5851","Type":"ContainerStarted","Data":"6953a2573da38f7bc8140ee278e90e6b12d78a56e39a5f7fde5cb7802248bc99"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253161 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253234 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.253283 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.753257901 +0000 UTC m=+44.661997833 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253306 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.253364 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") pod \"d390eca3-a064-441f-b469-3111e626bcae\" (UID: \"d390eca3-a064-441f-b469-3111e626bcae\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.255233 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume" (OuterVolumeSpecName: "config-volume") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.255440 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.255792 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.755778542 +0000 UTC m=+44.664518474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.256854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257344 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257399 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257457 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257481 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d390eca3-a064-441f-b469-3111e626bcae-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.257811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296863 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_collect-profiles-29483415-hwxpp_d390eca3-a064-441f-b469-3111e626bcae/collect-profiles/0.log" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296904 4720 generic.go:334] "Generic (PLEG): container finished" podID="d390eca3-a064-441f-b469-3111e626bcae" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" exitCode=2 Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.296967 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerDied","Data":"20bb4a1f28ddb483420180b50ee58dffb3342280cd3e1eb7cd43cf194a8782bb"} Jan 21 14:30:06 crc 
kubenswrapper[4720]: I0121 14:30:06.296992 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" event={"ID":"d390eca3-a064-441f-b469-3111e626bcae","Type":"ContainerDied","Data":"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.297017 4720 scope.go:117] "RemoveContainer" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.297109 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.299519 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"community-operators-fwhvj\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.313066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"c6cff6b05c13f3c17654a3bfac26f39af51d795e43510a703262697f21a80cac"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.324286 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.328778 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"e000f9d0c7d509713368b4e850e9619d534255a1c2acd7e5f742ea3f09048459"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.330907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.336613 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc" (OuterVolumeSpecName: "kube-api-access-4dqdc") pod "d390eca3-a064-441f-b469-3111e626bcae" (UID: "d390eca3-a064-441f-b469-3111e626bcae"). InnerVolumeSpecName "kube-api-access-4dqdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.356298 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" event={"ID":"a6e00143-8d6c-45fb-aa6c-44015c27a3f1","Type":"ContainerStarted","Data":"d9c7637e04f8708eda047091f131189948f6f5c5b271d1b24e156faa2ee2aebe"} Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361560 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361921 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d390eca3-a064-441f-b469-3111e626bcae-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.361935 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dqdc\" (UniqueName: \"kubernetes.io/projected/d390eca3-a064-441f-b469-3111e626bcae-kube-api-access-4dqdc\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.362341 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.86232179 +0000 UTC m=+44.771061732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.371839 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body= Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.371901 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.373254 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-zxv2h" podStartSLOduration=24.373243866 podStartE2EDuration="24.373243866s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.285373982 +0000 UTC m=+44.194113924" watchObservedRunningTime="2026-01-21 14:30:06.373243866 +0000 UTC m=+44.281983798" Jan 21 
14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.373835 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-86dvk" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.470196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.476900 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:06.976879452 +0000 UTC m=+44.885619384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.486920 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.545691 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-gxq5z" podStartSLOduration=24.545673561 podStartE2EDuration="24.545673561s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.545147747 +0000 UTC m=+44.453887699" watchObservedRunningTime="2026-01-21 14:30:06.545673561 +0000 UTC m=+44.454413493" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.547704 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zvq7p" podStartSLOduration=24.547695449 podStartE2EDuration="24.547695449s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.469083494 +0000 UTC m=+44.377823446" watchObservedRunningTime="2026-01-21 14:30:06.547695449 +0000 UTC m=+44.456435401" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.579236 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.579392 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-21 14:30:07.079375566 +0000 UTC m=+44.988115498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.579475 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.580116 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.080109738 +0000 UTC m=+44.988849670 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.584161 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" podStartSLOduration=24.584146791 podStartE2EDuration="24.584146791s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.582904676 +0000 UTC m=+44.491644608" watchObservedRunningTime="2026-01-21 14:30:06.584146791 +0000 UTC m=+44.492886723" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.657352 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" podStartSLOduration=24.657335453 podStartE2EDuration="24.657335453s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.655081069 +0000 UTC m=+44.563821031" watchObservedRunningTime="2026-01-21 14:30:06.657335453 +0000 UTC m=+44.566075395" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.681397 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.681758 4720 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.181738987 +0000 UTC m=+45.090478929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.718901 4720 scope.go:117] "RemoveContainer" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.723912 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": container with ID starting with a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409 not found: ID does not exist" containerID="a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.723960 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409"} err="failed to get container status \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": rpc error: code = NotFound desc = could not find container \"a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409\": container with ID starting with a8aefdfc6a0ce58c176fdd5bc916c454bc92f7fb8066e8d2403bc8df6646d409 not found: ID does not exist" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.735981 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" podStartSLOduration=24.735963748 podStartE2EDuration="24.735963748s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:06.732647145 +0000 UTC m=+44.641387087" watchObservedRunningTime="2026-01-21 14:30:06.735963748 +0000 UTC m=+44.644703680" Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.782522 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.785969 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.28594213 +0000 UTC m=+45.194682062 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.832503 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.847010 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483415-hwxpp"] Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.885211 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.885602 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.385583154 +0000 UTC m=+45.294323086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:06 crc kubenswrapper[4720]: I0121 14:30:06.988741 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:06 crc kubenswrapper[4720]: E0121 14:30:06.989312 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.489299712 +0000 UTC m=+45.398039644 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.089817 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.090466 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.590446649 +0000 UTC m=+45.499186581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.108468 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.123329 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.123601 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.131212 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:07 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:07 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:07 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.131273 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.172323 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.172384 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176281 4720 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176339 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176364 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.176450 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.191208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.193125 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.693110658 +0000 UTC m=+45.601850590 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.193515 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.193548 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.226862 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"] Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.295306 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.296048 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.796027574 +0000 UTC m=+45.704767506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.296098 4720 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-v2pht container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.296130 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podUID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.326816 4720 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-v2pht container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.326885 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht" podUID="afb1ffca-e30f-47cf-b399-2bd057039b10" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.382384 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-v2pht"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.407684 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.407965 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:07.907952683 +0000 UTC m=+45.816692615 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.434668 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" event={"ID":"48af697e-308a-4bdd-a5d8-d86cd5c4fb0c","Type":"ContainerStarted","Data":"519f47995dd784bb4cf274ea6fe463ec4e8006ef7435bade22b62727a5868bc1"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.469251 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" event={"ID":"fccce0ee-16e1-4237-8081-a6a3c93c5851","Type":"ContainerStarted","Data":"67810c6149f6a6e59bc9eb73bbd632c8fe0e8a6d31d812492283062b75fbb016"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.470230 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.478820 4720 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-5gg5l container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.478887 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" podUID="fccce0ee-16e1-4237-8081-a6a3c93c5851" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.498914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerStarted","Data":"24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.508344 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.509512 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.009492931 +0000 UTC m=+45.918232873 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.543312 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" event={"ID":"a6e00143-8d6c-45fb-aa6c-44015c27a3f1","Type":"ContainerStarted","Data":"0434a9b612c70d163cdcd2c8e0ae7449dc94c05a57dde70fbc11296020e73ce4"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.556910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"a5faac90817005d0ec9a2561734998e41366a786d383a006ee579e39b73b4941"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.577387 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerStarted","Data":"d3b5cdbc839bad4c3029ff33f78cd38f5b5e460e9963f6c280d92ade619bd510"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.579300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" event={"ID":"cc0cfb45-abdd-434f-ae63-f6ae0fc7c092","Type":"ContainerStarted","Data":"8e689c54f4aa279c50cb943b25e6e08fb2eab35cf67ed899457c81521c71aeb4"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.609473 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.610376 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.110363669 +0000 UTC m=+46.019103601 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.621164 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jtj6g" podStartSLOduration=25.621139461 podStartE2EDuration="25.621139461s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.619094434 +0000 UTC m=+45.527834376" watchObservedRunningTime="2026-01-21 14:30:07.621139461 +0000 UTC m=+45.529879393"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.624279 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"e88152461de95b2d3879730fa006e0e5249df7579f5958336bdfc79b7ce596e2"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.626546 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-n66gl" event={"ID":"8ac39f2f-2411-4585-b15c-c473b2fdc077","Type":"ContainerStarted","Data":"2a1760c0b0a1875048875fd14c37d6dbdd8f6c426ab4471ed1f5205716974b83"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.717128 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.718823 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.21879777 +0000 UTC m=+46.127537712 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.734642 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" event={"ID":"4e042627-4d69-4cc5-a00d-849fe4ce76f0","Type":"ContainerStarted","Data":"de1edc092793e7f1ba423fcf05a0d8e01201368f76f99b2ad5779465e9d8d83e"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.744083 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-njjgs" event={"ID":"bdfefc7f-6e59-460a-be36-220a37dd02d1","Type":"ContainerStarted","Data":"bfe30717a15d15ceb38ed46686648690145c7fb299f2f73f2c74a6bff0ccaf08"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.744794 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-njjgs"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.758431 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"df67309d4e16017900e2964193f307e14888f80c9c30ab9c6f7e58639fc93acf"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.762608 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"03c5d5d841f49c909c3e704a31faedc725965bd1e58ee925978fc2f5004161ab"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.770345 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" event={"ID":"92d3c944-8def-4f95-a3cb-781f929f5f28","Type":"ContainerStarted","Data":"5eefcb917305426fe1b4c817f0e44feef59c159d7b3860fd5af7b252d9ed7f26"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.782965 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n9vh6" event={"ID":"c3f6d778-ef18-4ad7-bd13-fb7e5983af23","Type":"ContainerStarted","Data":"99f28ed1d7f47d2c232a8199808cc278dd0935fb030dd791cb25673d70d9cd6a"}
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.819290 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.819677 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.876707 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l" podStartSLOduration=25.876682248 podStartE2EDuration="25.876682248s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.833308211 +0000 UTC m=+45.742048173" watchObservedRunningTime="2026-01-21 14:30:07.876682248 +0000 UTC m=+45.785422190"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.878493 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.885182 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"]
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.893232 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.898566 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.398549201 +0000 UTC m=+46.307289123 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.986849 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-d7wmg" podStartSLOduration=25.986830087 podStartE2EDuration="25.986830087s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:07.985326714 +0000 UTC m=+45.894066666" watchObservedRunningTime="2026-01-21 14:30:07.986830087 +0000 UTC m=+45.895570029"
Jan 21 14:30:07 crc kubenswrapper[4720]: I0121 14:30:07.995223 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:07 crc kubenswrapper[4720]: E0121 14:30:07.996079 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.496048175 +0000 UTC m=+46.404788107 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.028631 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"]
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.028884 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.028899 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.029015 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d390eca3-a064-441f-b469-3111e626bcae" containerName="collect-profiles"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.029960 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.031898 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.063607 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.097585 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102514 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102576 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102598 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.102670 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.103001 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.602986274 +0000 UTC m=+46.511726206 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.120756 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-5qcz5"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.131181 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 21 14:30:08 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld
Jan 21 14:30:08 crc kubenswrapper[4720]: [+]process-running ok
Jan 21 14:30:08 crc kubenswrapper[4720]: healthz check failed
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.131550 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.175425 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-njjgs" podStartSLOduration=13.175409965 podStartE2EDuration="13.175409965s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.132143751 +0000 UTC m=+46.040883693" watchObservedRunningTime="2026-01-21 14:30:08.175409965 +0000 UTC m=+46.084149897"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.176527 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-lrm9f" podStartSLOduration=26.176522096 podStartE2EDuration="26.176522096s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.17344785 +0000 UTC m=+46.082187792" watchObservedRunningTime="2026-01-21 14:30:08.176522096 +0000 UTC m=+46.085262028"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.178959 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.210853 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.211119 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.711100976 +0000 UTC m=+46.619840908 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211185 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211210 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.211234 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.218569 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.218827 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.234189 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7mfnf" podStartSLOduration=26.234176263 podStartE2EDuration="26.234176263s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.233102902 +0000 UTC m=+46.141842844" watchObservedRunningTime="2026-01-21 14:30:08.234176263 +0000 UTC m=+46.142916185"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.253799 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.254995 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.275717 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" podStartSLOduration=8.275705378 podStartE2EDuration="8.275705378s" podCreationTimestamp="2026-01-21 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.274939476 +0000 UTC m=+46.183679408" watchObservedRunningTime="2026-01-21 14:30:08.275705378 +0000 UTC m=+46.184445310"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.287250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"redhat-marketplace-c95rn\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") " pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.288478 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.312354 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.313860 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.813848567 +0000 UTC m=+46.722588499 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.354899 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355225 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355474 4720 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vxdw2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused" start-of-body=
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.355604 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.30:8080/healthz\": dial tcp 10.217.0.30:8080: connect: connection refused"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.367048 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.407201 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-s6zqk" podStartSLOduration=26.407183495 podStartE2EDuration="26.407183495s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.331491832 +0000 UTC m=+46.240231764" watchObservedRunningTime="2026-01-21 14:30:08.407183495 +0000 UTC m=+46.315923427"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.407755 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n9vh6" podStartSLOduration=13.40775017 podStartE2EDuration="13.40775017s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.406032262 +0000 UTC m=+46.314772204" watchObservedRunningTime="2026-01-21 14:30:08.40775017 +0000 UTC m=+46.316490102"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414259 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.414508 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.414641 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:08.914621503 +0000 UTC m=+46.823361435 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.502412 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-5gg5l"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520302 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520357 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520393 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.520850 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.020821611 +0000 UTC m=+46.929561543 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.520866 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.521156 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.551450 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"redhat-marketplace-jbtfr\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.622133 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.622435 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.122419781 +0000 UTC m=+47.031159713 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.638055 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.641473 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x7575"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.675530 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.686361 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.717841 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d390eca3-a064-441f-b469-3111e626bcae" path="/var/lib/kubelet/pods/d390eca3-a064-441f-b469-3111e626bcae/volumes"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.718417 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.723536 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.723874 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.223862925 +0000 UTC m=+47.132602857 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.827796 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828216 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.828237 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.829784 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.329767996 +0000 UTC m=+47.238507918 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.831056 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" event={"ID":"5c00abc0-dc46-406c-8f2f-6904ac88126d","Type":"ContainerStarted","Data":"ea3e42f59558ac33a0401b68f0727a6c36ce0105fe3235e649f0da0ed75102ed"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.833391 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.845299 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.846410 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.854724 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" exitCode=0
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.855499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.864270 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.872225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"]
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.884418 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rwv28" podStartSLOduration=26.884399938 podStartE2EDuration="26.884399938s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:08.884075578 +0000 UTC m=+46.792815520" watchObservedRunningTime="2026-01-21 14:30:08.884399938 +0000 UTC m=+46.793139870"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.893629 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"85ca11cc33d09ce2c8fd7bab9c3118f3fb41bcc9c4f1e36c585b8c6b04ce1492"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.911034 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-x48m6" event={"ID":"139c8416-e015-49e4-adfe-32f9e142621f","Type":"ContainerStarted","Data":"f20d09863be7ca61fa7342b49aa7455a47ce4ac93bdba3ee521a02fa9dc39c25"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934193 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.934302 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.935195 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.935401 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:08 crc kubenswrapper[4720]: E0121 14:30:08.935561 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.435546911 +0000 UTC m=+47.344286843 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.936231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" event={"ID":"0f685084-f748-4a34-9020-4d562f2a6d45","Type":"ContainerStarted","Data":"604be98ef64db9bf2dee8b6519ee20ed2b69e3f5461179f4162ff99dea72456c"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.959324 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f" exitCode=0
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960272 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"f92665f685bf80e17e3e48269da656cf92cd51a8b00c063a085e7a0052993aa3"}
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.960388 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" gracePeriod=30
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.972211 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-42snh"
Jan 21 14:30:08 crc kubenswrapper[4720]: I0121 14:30:08.988573 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"redhat-operators-x7575\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") " pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.036471 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" podStartSLOduration=27.036453771 podStartE2EDuration="27.036453771s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:09.034195818 +0000 UTC m=+46.942935750" watchObservedRunningTime="2026-01-21 14:30:09.036453771 +0000 UTC m=+46.945193703"
Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.037709 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.037957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.038051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.038131 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.038218 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.53820665 +0000 UTC m=+47.446946582 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.040116 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.090716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.133895 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:09 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:09 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:09 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.133950 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141509 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141590 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.141710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.146424 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"redhat-operators-52n8k\" (UID: 
\"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.148055 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.648043071 +0000 UTC m=+47.556783003 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.150928 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.188557 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-x48m6" podStartSLOduration=28.188540417 podStartE2EDuration="28.188540417s" podCreationTimestamp="2026-01-21 14:29:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:09.185147841 +0000 UTC m=+47.093887763" watchObservedRunningTime="2026-01-21 14:30:09.188540417 +0000 UTC m=+47.097280349" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.196955 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"redhat-operators-52n8k\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") " pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.280584 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.281921 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.781895815 +0000 UTC m=+47.690635747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.294372 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k" Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.387027 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.387314 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.8873025 +0000 UTC m=+47.796042432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.474341 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.488173 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.488461 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:09.988443487 +0000 UTC m=+47.897183419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.588771 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.589314 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.089297275 +0000 UTC m=+47.998037207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.689428 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.689643 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.189617688 +0000 UTC m=+48.098357610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.690183 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.690531 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.190515903 +0000 UTC m=+48.099255845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.790679 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.790910 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.290887648 +0000 UTC m=+48.199627580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.893958 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.894472 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.394454433 +0000 UTC m=+48.303194365 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.983922 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" exitCode=0 Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.984187 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.984242 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerStarted","Data":"79477e1af1d10f20ff2bf7e280a7ee476108ea069779af5f5cdc35424364da3b"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.993564 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35" exitCode=0 Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.993683 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35"} Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.994868 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.995017 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.494994782 +0000 UTC m=+48.403734714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:09 crc kubenswrapper[4720]: I0121 14:30:09.995212 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:09 crc kubenswrapper[4720]: E0121 14:30:09.995534 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.495521347 +0000 UTC m=+48.404261279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.005414 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" exitCode=0 Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.005497 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.010936 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" exitCode=0 Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.011021 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.011056 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" 
event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerStarted","Data":"9c2892b80c1d95c871202545822430a42e2c2316e71ccc122df3bcadd593a956"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.017982 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"0a63c057137dd1dce00640395cbccb63187990cb6f7fcaffbda1530cf924ee49"} Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.098083 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.100235 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.600217723 +0000 UTC m=+48.508957655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.137938 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:10 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:10 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:10 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.138002 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.202182 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.202442 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.702431389 +0000 UTC m=+48.611171321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.303414 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.303513 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.803498183 +0000 UTC m=+48.712238115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.303803 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.304092 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.80408448 +0000 UTC m=+48.712824402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.405106 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.405298 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.905271138 +0000 UTC m=+48.814011070 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.405365 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.405677 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:10.905666658 +0000 UTC m=+48.814406590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.479079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"] Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.506765 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.006742103 +0000 UTC m=+48.915482035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.506807 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.506898 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.507233 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.007224247 +0000 UTC m=+48.915964179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.591259 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.607922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.609679 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.109634639 +0000 UTC m=+49.018374571 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: W0121 14:30:10.621397 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod328ecaa4_59eb_4707_a320_245636d0c778.slice/crio-b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d WatchSource:0}: Error finding container b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d: Status 404 returned error can't find the container with id b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.717820 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.718255 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.218231354 +0000 UTC m=+49.126971286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.838543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.838685 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.338666751 +0000 UTC m=+49.247406683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.838791 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.839089 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.339081373 +0000 UTC m=+49.247821305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:10 crc kubenswrapper[4720]: I0121 14:30:10.940119 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:10 crc kubenswrapper[4720]: E0121 14:30:10.940640 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.44062282 +0000 UTC m=+49.349362752 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.033879 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475"} Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.034295 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"26891b408ccd24b0c8434d044528c04f82c156ee44333c5cc05cf38ad2ef94ce"} Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.041498 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.041967 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.541945752 +0000 UTC m=+49.450685684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.044057 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d"} Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.122990 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:11 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:11 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:11 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.123272 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.142513 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.142693 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.642666777 +0000 UTC m=+49.551406709 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.142935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.143250 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-21 14:30:11.643237103 +0000 UTC m=+49.551977035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.244962 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.245126 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.745097159 +0000 UTC m=+49.653837101 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.245245 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.245590 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.745581932 +0000 UTC m=+49.654321864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.345949 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.346062 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.84603704 +0000 UTC m=+49.754776962 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.346353 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.347193 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.847180702 +0000 UTC m=+49.755920634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.365383 4720 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.450122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.450339 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.950312154 +0000 UTC m=+49.859052086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.451396 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.451768 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:11.951755074 +0000 UTC m=+49.860495006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.552107 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.552263 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.052233472 +0000 UTC m=+49.960973404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.552374 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.552686 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.052670724 +0000 UTC m=+49.961410656 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.655495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.655812 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.155797466 +0000 UTC m=+50.064537398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.680248 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.680950 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.690482 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.690621 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.694855 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761275 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761333 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.761420 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.761710 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.261698166 +0000 UTC m=+50.170438098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6kjwf" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864513 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.864912 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: E0121 14:30:11.865364 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-21 14:30:12.365345043 +0000 UTC m=+50.274084985 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.865402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.885100 4720 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-21T14:30:11.365409613Z","Handler":null,"Name":""} Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.896370 4720 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.896408 4720 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.920269 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.969348 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.972708 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.974341 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.979002 4720 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.979048 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.983605 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.984001 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 21 14:30:11 crc kubenswrapper[4720]: I0121 14:30:11.985860 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.031700 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.078424 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.078474 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.123029 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:12 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:12 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:12 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.123079 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180443 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180485 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.180858 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.184297 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475" exitCode=0 Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.184357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475"} Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.237961 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" exitCode=0 Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.238073 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744"} Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.239944 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.272930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6kjwf\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.279487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"eb9a838c233ac9fd43524b7ce216d0485f99ae1db53c573564d9447916affa15"} Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.282116 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.315236 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.342737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.506420 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.714784 4720 patch_prober.go:28] interesting pod/apiserver-76f77b778f-pm8dm container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]log ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]etcd ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/generic-apiserver-start-informers ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/max-in-flight-filter ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 21 14:30:12 crc kubenswrapper[4720]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/project.openshift.io-projectcache ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-startinformers ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 21 14:30:12 crc kubenswrapper[4720]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 21 14:30:12 crc kubenswrapper[4720]: livez check failed Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.714834 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" podUID="0f685084-f748-4a34-9020-4d562f2a6d45" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.715775 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.719976 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.720014 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:30:12 crc kubenswrapper[4720]: I0121 14:30:12.889530 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.024891 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.084215 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.099789 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-njjgs" Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.125973 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:13 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:13 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:13 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.126055 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.411059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-j577t" event={"ID":"e675e6aa-6d61-4490-b768-1dbe664d1dfe","Type":"ContainerStarted","Data":"1d976a40674b8dfa980b1373ae0e6473bf7974dd711406c75edbb43b7bcd7b54"} Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.446909 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerStarted","Data":"75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce"} Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.449580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerStarted","Data":"cc78447803378e22f6cbae3e9270bdc6d0ee1630fceb9cd43ec6c839a71ce985"} Jan 21 14:30:13 crc kubenswrapper[4720]: I0121 14:30:13.450245 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerStarted","Data":"bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f"} Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.124026 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:14 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:14 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:14 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.125155 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.473325 4720 generic.go:334] "Generic (PLEG): container finished" podID="c48951e9-42eb-461f-812e-adc413405821" containerID="24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4" exitCode=0 Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.473394 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerDied","Data":"24914b76a0e5210499019f7f0b2d263f162c0daf747c7bb929ce8a0cf24ad2a4"} Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.545048 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-j577t" podStartSLOduration=19.545016921 podStartE2EDuration="19.545016921s" podCreationTimestamp="2026-01-21 14:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:14.542972523 +0000 UTC m=+52.451712475" watchObservedRunningTime="2026-01-21 14:30:14.545016921 +0000 UTC m=+52.453756853" Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.778311 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 21 14:30:14 crc kubenswrapper[4720]: I0121 14:30:14.814065 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.125855 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:15 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:15 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:15 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.126119 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.540942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerStarted","Data":"e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d"} Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.541087 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.546188 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerStarted","Data":"780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5"} Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.549053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerStarted","Data":"bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec"} Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.570054 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" podStartSLOduration=33.570033626 podStartE2EDuration="33.570033626s" podCreationTimestamp="2026-01-21 14:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.568155523 +0000 UTC m=+53.476895455" watchObservedRunningTime="2026-01-21 14:30:15.570033626 +0000 UTC m=+53.478773558" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.593588 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.593553185 podStartE2EDuration="1.593553185s" podCreationTimestamp="2026-01-21 14:30:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.583167044 +0000 UTC m=+53.491906986" watchObservedRunningTime="2026-01-21 14:30:15.593553185 +0000 UTC m=+53.502293137" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.630626 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.630611684 podStartE2EDuration="4.630611684s" podCreationTimestamp="2026-01-21 14:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.628640569 +0000 UTC m=+53.537380511" watchObservedRunningTime="2026-01-21 14:30:15.630611684 +0000 UTC m=+53.539351616" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.646208 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=4.646193301 podStartE2EDuration="4.646193301s" podCreationTimestamp="2026-01-21 14:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:15.643938918 +0000 UTC m=+53.552678870" watchObservedRunningTime="2026-01-21 14:30:15.646193301 +0000 UTC m=+53.554933233" Jan 21 14:30:15 crc kubenswrapper[4720]: I0121 14:30:15.905762 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103428 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.103573 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") pod \"c48951e9-42eb-461f-812e-adc413405821\" (UID: \"c48951e9-42eb-461f-812e-adc413405821\") " Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.104542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume" (OuterVolumeSpecName: "config-volume") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.116634 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.128485 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:16 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:16 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:16 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.128562 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.130361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h" (OuterVolumeSpecName: "kube-api-access-4rl6h") pod "c48951e9-42eb-461f-812e-adc413405821" (UID: "c48951e9-42eb-461f-812e-adc413405821"). InnerVolumeSpecName "kube-api-access-4rl6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204927 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c48951e9-42eb-461f-812e-adc413405821-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204965 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rl6h\" (UniqueName: \"kubernetes.io/projected/c48951e9-42eb-461f-812e-adc413405821-kube-api-access-4rl6h\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.204979 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c48951e9-42eb-461f-812e-adc413405821-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.561633 4720 generic.go:334] "Generic (PLEG): container finished" podID="38609d5a-a946-4abf-8b84-2e90a636844a" containerID="bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec" exitCode=0 Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.561708 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerDied","Data":"bdb6a29b95e5e8d428c82dd29717231f2fc0234e21a7718e8c673c84ad6c02ec"} Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598092 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598085 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483430-jcp9p" event={"ID":"c48951e9-42eb-461f-812e-adc413405821","Type":"ContainerDied","Data":"651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846"} Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.598324 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="651c91098a6b5beb1bb69833f5373a6ae3cd82dd60030a87f4a9c3ad1187b846" Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.632861 4720 generic.go:334] "Generic (PLEG): container finished" podID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerID="780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5" exitCode=0 Jan 21 14:30:16 crc kubenswrapper[4720]: I0121 14:30:16.633030 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerDied","Data":"780a7d93308f4ebd077233412e2b3ef3d2859a907091858637f7731a5de212e5"} Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.122329 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:17 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:17 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:17 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.122386 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175176 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175235 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175920 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.175953 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.178083 4720 patch_prober.go:28] interesting pod/downloads-7954f5f757-wmxb9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.178108 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wmxb9" podUID="120bd3b2-5437-4a15-bcc4-32ae06eb7f1f" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.707566 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:30:17 crc kubenswrapper[4720]: I0121 14:30:17.727776 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-pm8dm" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.061780 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.121533 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:18 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:18 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:18 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.121599 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244437 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") pod \"0389f8b8-4893-4619-b91f-0f2ef883fd85\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244513 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0389f8b8-4893-4619-b91f-0f2ef883fd85" (UID: "0389f8b8-4893-4619-b91f-0f2ef883fd85"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244588 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") pod \"0389f8b8-4893-4619-b91f-0f2ef883fd85\" (UID: \"0389f8b8-4893-4619-b91f-0f2ef883fd85\") " Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.244996 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0389f8b8-4893-4619-b91f-0f2ef883fd85-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.253922 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0389f8b8-4893-4619-b91f-0f2ef883fd85" (UID: "0389f8b8-4893-4619-b91f-0f2ef883fd85"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.347592 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0389f8b8-4893-4619-b91f-0f2ef883fd85-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.356460 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.359304 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.551681 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") pod \"38609d5a-a946-4abf-8b84-2e90a636844a\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.551741 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") pod \"38609d5a-a946-4abf-8b84-2e90a636844a\" (UID: \"38609d5a-a946-4abf-8b84-2e90a636844a\") " Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.553056 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "38609d5a-a946-4abf-8b84-2e90a636844a" (UID: "38609d5a-a946-4abf-8b84-2e90a636844a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.564169 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "38609d5a-a946-4abf-8b84-2e90a636844a" (UID: "38609d5a-a946-4abf-8b84-2e90a636844a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.627397 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.635757 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.638962 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:18 crc kubenswrapper[4720]: E0121 14:30:18.639017 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.677536 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/38609d5a-a946-4abf-8b84-2e90a636844a-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.677570 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/38609d5a-a946-4abf-8b84-2e90a636844a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.736725 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.758607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0389f8b8-4893-4619-b91f-0f2ef883fd85","Type":"ContainerDied","Data":"bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f"} Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.758873 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd668709459a7d9349dcca3b2f18a9e0573e3a3a954d0f356bfcecf5d43778f" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807434 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807797 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"38609d5a-a946-4abf-8b84-2e90a636844a","Type":"ContainerDied","Data":"75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce"} Jan 21 14:30:18 crc kubenswrapper[4720]: I0121 14:30:18.807826 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75d81f40cb1aa80276ca163d8e80d21e159b813861ad79754529e04e661489ce" Jan 21 14:30:19 crc kubenswrapper[4720]: I0121 14:30:19.121300 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:19 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:19 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:19 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:19 crc kubenswrapper[4720]: I0121 14:30:19.121346 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:20 crc kubenswrapper[4720]: I0121 14:30:20.121479 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:20 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:20 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:20 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:20 crc kubenswrapper[4720]: I0121 14:30:20.121526 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:21 crc kubenswrapper[4720]: I0121 14:30:21.121084 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:21 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:21 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:21 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:21 crc kubenswrapper[4720]: I0121 14:30:21.121397 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:22 crc kubenswrapper[4720]: I0121 14:30:22.121168 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:22 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:22 crc 
kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:22 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:22 crc kubenswrapper[4720]: I0121 14:30:22.121248 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:23 crc kubenswrapper[4720]: I0121 14:30:23.122082 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:23 crc kubenswrapper[4720]: [-]has-synced failed: reason withheld Jan 21 14:30:23 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:23 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:23 crc kubenswrapper[4720]: I0121 14:30:23.122163 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:24 crc kubenswrapper[4720]: I0121 14:30:24.121679 4720 patch_prober.go:28] interesting pod/router-default-5444994796-5qcz5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 21 14:30:24 crc kubenswrapper[4720]: [+]has-synced ok Jan 21 14:30:24 crc kubenswrapper[4720]: [+]process-running ok Jan 21 14:30:24 crc kubenswrapper[4720]: healthz check failed Jan 21 14:30:24 crc kubenswrapper[4720]: I0121 14:30:24.121757 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-5qcz5" podUID="f55572f9-fbba-4efa-a6a8-94884f06f9c3" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 21 14:30:25 crc kubenswrapper[4720]: I0121 14:30:25.122899 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:30:25 crc kubenswrapper[4720]: I0121 14:30:25.128134 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-5qcz5" Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.167874 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.168206 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 21 14:30:27 crc kubenswrapper[4720]: I0121 14:30:27.181244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wmxb9" Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.599817 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.601337 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.604102 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:28 crc kubenswrapper[4720]: E0121 14:30:28.604151 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:32 crc kubenswrapper[4720]: I0121 14:30:32.511844 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.144761 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.344438 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:37 crc kubenswrapper[4720]: I0121 14:30:37.358070 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:30:38 crc kubenswrapper[4720]: I0121 14:30:38.558035 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pm55" Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.593163 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.594548 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.595854 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:38 crc kubenswrapper[4720]: E0121 14:30:38.595915 4720 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.371049 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log" Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.371985 4720 generic.go:334] "Generic (PLEG): container finished" podID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" exitCode=137 Jan 21 14:30:41 crc kubenswrapper[4720]: I0121 14:30:41.372117 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerDied","Data":"316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55"} Jan 21 14:30:46 crc kubenswrapper[4720]: I0121 14:30:46.701161 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.590677 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592111 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592446 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:48 crc kubenswrapper[4720]: E0121 14:30:48.592504 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282252 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282830 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" 
containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282848 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282869 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282875 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: E0121 14:30:51.282887 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.282894 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283005 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48951e9-42eb-461f-812e-adc413405821" containerName="collect-profiles" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283026 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="38609d5a-a946-4abf-8b84-2e90a636844a" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283038 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0389f8b8-4893-4619-b91f-0f2ef883fd85" containerName="pruner" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.283441 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.286460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.286820 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.290757 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.319343 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.319321608 podStartE2EDuration="5.319321608s" podCreationTimestamp="2026-01-21 14:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:30:51.317268047 +0000 UTC m=+89.226007989" watchObservedRunningTime="2026-01-21 14:30:51.319321608 +0000 UTC m=+89.228061540" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.323868 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.324009 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424781 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424849 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.424925 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.450505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:51 crc kubenswrapper[4720]: I0121 14:30:51.609994 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.271343 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.272477 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.285372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.286151 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.286614 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.293168 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.388749 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389238 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389471 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.389524 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"installer-9-crc\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.411403 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"installer-9-crc\" (UID: 
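[Editor's note] The records above show the reconciler's two-phase flow for installer-9-crc: VerifyControllerAttachedVolume confirms each volume is usable, then MountVolume.SetUp mounts it. hostPath and projected volumes need no external attach step, so SetUp succeeds within milliseconds. A sketch of the three volumes involved, using k8s.io/api types; the hostPath locations are NOT in the log, so the paths below are illustrative guesses only:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Sketch of the volumes the reconciler mounts above for installer-9-crc.
	volumes := []corev1.Volume{
		{Name: "kubelet-dir", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/lib/kubelet"}, // hypothetical path
		}},
		{Name: "var-lock", VolumeSource: corev1.VolumeSource{
			HostPath: &corev1.HostPathVolumeSource{Path: "/var/lock"}, // hypothetical path
		}},
		// kube-api-access is the projected service-account token volume the
		// kubelet injects for API access.
		{Name: "kube-api-access", VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token"}},
				},
			},
		}},
	}
	for _, v := range volumes {
		fmt.Println(v.Name)
	}
}
```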
\"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:56 crc kubenswrapper[4720]: I0121 14:30:56.601754 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.819876 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.820338 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-swm4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-fwhvj_openshift-marketplace(d436685f-1f7d-454b-afa4-76389c5c5ff4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:30:57 crc kubenswrapper[4720]: E0121 14:30:57.821548 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.590526 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.590939 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.591263 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"] Jan 21 14:30:58 crc kubenswrapper[4720]: E0121 14:30:58.591321 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:01 crc kubenswrapper[4720]: E0121 14:31:01.460178 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.545214 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.545744 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
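[Editor's note] The sequence above is the standard pull-failure cycle: the pull fails with ErrImagePull (here because the copy was canceled), the pod worker re-queues the pod, and subsequent syncs report ImagePullBackOff while the kubelet waits out an exponential back-off. By default that delay starts around 10s and doubles up to a 5-minute cap; the exact values are configuration-dependent, so treat the following schedule as an illustration only:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Doubling back-off of the kind kubelet applies between image pull
	// retries (assumed defaults: 10s initial, 5m cap).
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("retry %d after %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```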
Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.545744 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4szdg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-c95rn_openshift-marketplace(8432f9d9-0168-4b49-b6a7-66281f46bd5a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 14:31:06 crc kubenswrapper[4720]: E0121 14:31:06.548243 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a"
Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591024 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591674 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591897 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55" cmd=["/bin/bash","-c","test -f /ready/ready"]
Jan 21 14:31:08 crc kubenswrapper[4720]: E0121 14:31:08.591968 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins"
Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.201303 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.201484 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ql9d6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jbtfr_openshift-marketplace(aa280405-236d-4a24-896d-04a2dfad8a3a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Jan 21 14:31:09 crc kubenswrapper[4720]: E0121 14:31:09.202711 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a"
Jan 21 14:31:10 crc kubenswrapper[4720]: E0121 14:31:10.872749 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a"
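[Editor's note] The &Container{...} dumps above are the kubelet printing the full init-container spec it failed to start. Rebuilt as Go types, taking the name, image, command, args, mount paths, and security context verbatim from the logged fields (the generic ptr helper is local, not a kubelet API):

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func ptr[T any](v T) *T { return &v }

func main() {
	// The "extract-content" init container from the dumps above. It copies a
	// pre-built catalog and cache out of the index image into shared volumes.
	c := corev1.Container{
		Name:    "extract-content",
		Image:   "registry.redhat.io/redhat/redhat-marketplace-index:v4.18",
		Command: []string{"/utilities/copy-content"},
		Args: []string{
			"--catalog.from=/configs",
			"--catalog.to=/extracted-catalog/catalog",
			"--cache.from=/tmp/cache",
			"--cache.to=/extracted-catalog/cache",
		},
		VolumeMounts: []corev1.VolumeMount{
			{Name: "utilities", MountPath: "/utilities"},
			{Name: "catalog-content", MountPath: "/extracted-catalog"},
			// The kube-api-access-ql9d6 service-account mount in the dump is
			// injected automatically and omitted here.
		},
		ImagePullPolicy:          corev1.PullAlways,
		TerminationMessagePath:   "/dev/termination-log",
		TerminationMessagePolicy: corev1.TerminationMessageFallbackToLogsOnError,
		SecurityContext: &corev1.SecurityContext{
			Capabilities:             &corev1.Capabilities{Drop: []corev1.Capability{"ALL"}},
			RunAsUser:                ptr(int64(1000170000)),
			RunAsNonRoot:             ptr(true),
			AllowPrivilegeEscalation: ptr(false),
		},
	}
	fmt.Println(c.Name, c.Image)
}
```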
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.265622 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.265816 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmfz4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v6vwc_openshift-marketplace(1d6131a5-b63e-42a5-905a-9ed5350a421a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.267271 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.327936 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.328136 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kxcbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lt46m_openshift-marketplace(7bb4c793-0d05-43f9-a9ad-30d9b6b40595): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.329397 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.428558 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.428709 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfn9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5qbdf_openshift-marketplace(4bbb0e48-d287-42fc-a165-86038d2083c9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:13 crc kubenswrapper[4720]: E0121 14:31:13.430119 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.690313 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.690646 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.709342 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.731787 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.731954 4720 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mmzpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-x7575_openshift-marketplace(328ecaa4-59eb-4707-a320-245636d0c778): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.733978 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.763790 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.763901 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t4pfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-52n8k_openshift-marketplace(306f9668-a044-448f-a14f-81c9726d3008): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 21 14:31:15 crc kubenswrapper[4720]: E0121 14:31:15.765165 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.765876 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log" Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.765929 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.765929 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k"
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951139 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") "
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") "
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951555 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") "
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951575 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") pod \"75c0e088-7bdf-47f4-b434-b184e742d40a\" (UID: \"75c0e088-7bdf-47f4-b434-b184e742d40a\") "
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951801 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.951910 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready" (OuterVolumeSpecName: "ready") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.952054 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:31:15 crc kubenswrapper[4720]: I0121 14:31:15.961381 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8" (OuterVolumeSpecName: "kube-api-access-2cdm8") pod "75c0e088-7bdf-47f4-b434-b184e742d40a" (UID: "75c0e088-7bdf-47f4-b434-b184e742d40a"). InnerVolumeSpecName "kube-api-access-2cdm8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052590 4720 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/75c0e088-7bdf-47f4-b434-b184e742d40a-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052637 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdm8\" (UniqueName: \"kubernetes.io/projected/75c0e088-7bdf-47f4-b434-b184e742d40a-kube-api-access-2cdm8\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052646 4720 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/75c0e088-7bdf-47f4-b434-b184e742d40a-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.052675 4720 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/75c0e088-7bdf-47f4-b434-b184e742d40a-ready\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.149006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 21 14:31:16 crc kubenswrapper[4720]: W0121 14:31:16.160219 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd3bb0d67_7131_40e1_818d_5d4fd5c1a725.slice/crio-ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a WatchSource:0}: Error finding container ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a: Status 404 returned error can't find the container with id ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.170588 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 21 14:31:16 crc kubenswrapper[4720]: W0121 14:31:16.186123 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d8131be_bd51_4ed7_bb5c_57990adf304a.slice/crio-f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce WatchSource:0}: Error finding container f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce: Status 404 returned error can't find the container with id f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213894 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-nwj8k_75c0e088-7bdf-47f4-b434-b184e742d40a/kube-multus-additional-cni-plugins/0.log"
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213955 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k" event={"ID":"75c0e088-7bdf-47f4-b434-b184e742d40a","Type":"ContainerDied","Data":"606f33407deb43968f7cc7f66c83d922a2e45672a6ac0cad952ee6a566842321"}
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.213988 4720 scope.go:117] "RemoveContainer" containerID="316c44a71814ef51610b90808d48af06d2a280f1407ff9534d777f78d84e3a55"
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.214074 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-nwj8k"
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.217982 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerStarted","Data":"ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a"}
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.219586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerStarted","Data":"f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce"}
Jan 21 14:31:16 crc kubenswrapper[4720]: E0121 14:31:16.221262 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778"
Jan 21 14:31:16 crc kubenswrapper[4720]: E0121 14:31:16.221393 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008"
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.255644 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"]
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.259593 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-nwj8k"]
Jan 21 14:31:16 crc kubenswrapper[4720]: I0121 14:31:16.686054 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" path="/var/lib/kubelet/pods/75c0e088-7bdf-47f4-b434-b184e742d40a/volumes"
Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.226557 4720 generic.go:334] "Generic (PLEG): container finished" podID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerID="2512271d020bfe9083ce97421060dd72da178b6e1eacc8d10a11852e7a71fefd" exitCode=0
Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.226604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerDied","Data":"2512271d020bfe9083ce97421060dd72da178b6e1eacc8d10a11852e7a71fefd"}
Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.233557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerStarted","Data":"7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e"}
Jan 21 14:31:17 crc kubenswrapper[4720]: I0121 14:31:17.263406 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=21.263385149 podStartE2EDuration="21.263385149s" podCreationTimestamp="2026-01-21 14:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:31:17.259519534 +0000 UTC m=+115.168259486" watchObservedRunningTime="2026-01-21 14:31:17.263385149 +0000 UTC m=+115.172125101"
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.241496 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739"}
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.513859 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587563 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") pod \"8d8131be-bd51-4ed7-bb5c-57990adf304a\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") "
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") pod \"8d8131be-bd51-4ed7-bb5c-57990adf304a\" (UID: \"8d8131be-bd51-4ed7-bb5c-57990adf304a\") "
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587742 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d8131be-bd51-4ed7-bb5c-57990adf304a" (UID: "8d8131be-bd51-4ed7-bb5c-57990adf304a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.587877 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d8131be-bd51-4ed7-bb5c-57990adf304a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.596472 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d8131be-bd51-4ed7-bb5c-57990adf304a" (UID: "8d8131be-bd51-4ed7-bb5c-57990adf304a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:31:18 crc kubenswrapper[4720]: I0121 14:31:18.689682 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d8131be-bd51-4ed7-bb5c-57990adf304a-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"8d8131be-bd51-4ed7-bb5c-57990adf304a","Type":"ContainerDied","Data":"f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce"}
Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249904 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b4b3b818f563e853029353849c777c7f1d1e47810bde0bf68f24904d8312ce"
Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.249214 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.254039 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739" exitCode=0
Jan 21 14:31:19 crc kubenswrapper[4720]: I0121 14:31:19.254129 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739"}
Jan 21 14:31:20 crc kubenswrapper[4720]: I0121 14:31:20.261027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerStarted","Data":"87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567"}
Jan 21 14:31:21 crc kubenswrapper[4720]: I0121 14:31:21.694725 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fwhvj" podStartSLOduration=4.73684317 podStartE2EDuration="1m15.694704063s" podCreationTimestamp="2026-01-21 14:30:06 +0000 UTC" firstStartedPulling="2026-01-21 14:30:08.967842637 +0000 UTC m=+46.876582569" lastFinishedPulling="2026-01-21 14:31:19.92570353 +0000 UTC m=+117.834443462" observedRunningTime="2026-01-21 14:31:20.283848153 +0000 UTC m=+118.192588145" watchObservedRunningTime="2026-01-21 14:31:21.694704063 +0000 UTC m=+119.603444005"
Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.283507 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" exitCode=0
Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.283560 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37"}
Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.292257 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" exitCode=0
Jan 21 14:31:24 crc kubenswrapper[4720]: I0121 14:31:24.292309 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98"}
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.308592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerStarted","Data":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"}
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.311691 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerStarted","Data":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"}
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.330185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c95rn" podStartSLOduration=4.092685237 podStartE2EDuration="1m19.330161719s" podCreationTimestamp="2026-01-21 14:30:07 +0000 UTC" firstStartedPulling="2026-01-21 14:30:10.014396876 +0000 UTC m=+47.923136798" lastFinishedPulling="2026-01-21 14:31:25.251873348 +0000 UTC m=+123.160613280" observedRunningTime="2026-01-21 14:31:26.326230872 +0000 UTC m=+124.234970804" watchObservedRunningTime="2026-01-21 14:31:26.330161719 +0000 UTC m=+124.238901661"
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.347290 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jbtfr" podStartSLOduration=3.058653801 podStartE2EDuration="1m18.347251269s" podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:09.986444672 +0000 UTC m=+47.895184604" lastFinishedPulling="2026-01-21 14:31:25.27504214 +0000 UTC m=+123.183782072" observedRunningTime="2026-01-21 14:31:26.344465306 +0000 UTC m=+124.253205248" watchObservedRunningTime="2026-01-21 14:31:26.347251269 +0000 UTC m=+124.255991201"
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.489253 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fwhvj"
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.489521 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fwhvj"
Jan 21 14:31:26 crc kubenswrapper[4720]: I0121 14:31:26.598921 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fwhvj"
Jan 21 14:31:27 crc kubenswrapper[4720]: I0121 14:31:27.354251 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fwhvj"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.367488 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.367648 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.427550 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.638557 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.638689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.674039 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jbtfr"
Jan 21 14:31:28 crc kubenswrapper[4720]: I0121 14:31:28.831238 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"]
Jan 21 14:31:29 crc kubenswrapper[4720]: I0121 14:31:29.329078 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"}
kubenswrapper[4720]: I0121 14:31:30.334616 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fwhvj" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" containerID="cri-o://87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" gracePeriod=2 Jan 21 14:31:32 crc kubenswrapper[4720]: I0121 14:31:32.343117 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" exitCode=0 Jan 21 14:31:32 crc kubenswrapper[4720]: I0121 14:31:32.343202 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"} Jan 21 14:31:33 crc kubenswrapper[4720]: I0121 14:31:33.350353 4720 generic.go:334] "Generic (PLEG): container finished" podID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerID="87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" exitCode=0 Jan 21 14:31:33 crc kubenswrapper[4720]: I0121 14:31:33.350398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567"} Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.453607 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486544 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.486668 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") pod \"d436685f-1f7d-454b-afa4-76389c5c5ff4\" (UID: \"d436685f-1f7d-454b-afa4-76389c5c5ff4\") " Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.488194 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities" (OuterVolumeSpecName: "utilities") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.493983 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t" (OuterVolumeSpecName: "kube-api-access-swm4t") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "kube-api-access-swm4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.544000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d436685f-1f7d-454b-afa4-76389c5c5ff4" (UID: "d436685f-1f7d-454b-afa4-76389c5c5ff4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587495 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587766 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swm4t\" (UniqueName: \"kubernetes.io/projected/d436685f-1f7d-454b-afa4-76389c5c5ff4-kube-api-access-swm4t\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:34 crc kubenswrapper[4720]: I0121 14:31:34.587862 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d436685f-1f7d-454b-afa4-76389c5c5ff4-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.364887 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fwhvj" event={"ID":"d436685f-1f7d-454b-afa4-76389c5c5ff4","Type":"ContainerDied","Data":"f92665f685bf80e17e3e48269da656cf92cd51a8b00c063a085e7a0052993aa3"} Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.365220 4720 scope.go:117] "RemoveContainer" containerID="87a9c41acb6766e559f32afd2cc2050647f1ed415f0920e717d72e495a698567" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.365007 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fwhvj" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.382234 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.384859 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fwhvj"] Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.395333 4720 scope.go:117] "RemoveContainer" containerID="0f822c28937b4fe73524abfab4f4eed108c52c489cfad83961a00b3c3e26a739" Jan 21 14:31:35 crc kubenswrapper[4720]: I0121 14:31:35.422381 4720 scope.go:117] "RemoveContainer" containerID="fea576e42ea53daf64f9e355cf2971b7c48351b927096e3397ea48c46de4d07f" Jan 21 14:31:36 crc kubenswrapper[4720]: I0121 14:31:36.685045 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" path="/var/lib/kubelet/pods/d436685f-1f7d-454b-afa4-76389c5c5ff4/volumes" Jan 21 14:31:38 crc kubenswrapper[4720]: I0121 14:31:38.405308 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:31:38 crc kubenswrapper[4720]: I0121 14:31:38.676043 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.618558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.623920 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" exitCode=0 Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.624179 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.628453 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.632158 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerStarted","Data":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.644690 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272"} Jan 21 14:31:39 crc kubenswrapper[4720]: I0121 14:31:39.760265 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x7575" podStartSLOduration=5.21038437 podStartE2EDuration="1m31.760243675s" 
podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:12.274929318 +0000 UTC m=+50.183669250" lastFinishedPulling="2026-01-21 14:31:38.824788583 +0000 UTC m=+136.733528555" observedRunningTime="2026-01-21 14:31:39.723712446 +0000 UTC m=+137.632452408" watchObservedRunningTime="2026-01-21 14:31:39.760243675 +0000 UTC m=+137.668983607" Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.630994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.631218 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jbtfr" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" containerID="cri-o://726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" gracePeriod=2 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.691945 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d" exitCode=0 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.692192 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d"} Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.697264 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" exitCode=0 Jan 21 14:31:40 crc kubenswrapper[4720]: I0121 14:31:40.697319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.510544 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594648 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.594734 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") pod \"aa280405-236d-4a24-896d-04a2dfad8a3a\" (UID: \"aa280405-236d-4a24-896d-04a2dfad8a3a\") " Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.595711 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities" (OuterVolumeSpecName: "utilities") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.609849 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6" (OuterVolumeSpecName: "kube-api-access-ql9d6") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "kube-api-access-ql9d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.636395 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa280405-236d-4a24-896d-04a2dfad8a3a" (UID: "aa280405-236d-4a24-896d-04a2dfad8a3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695789 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695834 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql9d6\" (UniqueName: \"kubernetes.io/projected/aa280405-236d-4a24-896d-04a2dfad8a3a-kube-api-access-ql9d6\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.695849 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa280405-236d-4a24-896d-04a2dfad8a3a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718322 4720 generic.go:334] "Generic (PLEG): container finished" podID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" exitCode=0 Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718387 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jbtfr" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718413 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718469 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jbtfr" event={"ID":"aa280405-236d-4a24-896d-04a2dfad8a3a","Type":"ContainerDied","Data":"79477e1af1d10f20ff2bf7e280a7ee476108ea069779af5f5cdc35424364da3b"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.718487 4720 scope.go:117] "RemoveContainer" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.721066 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272" exitCode=0 Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.721120 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.728034 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerStarted","Data":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.741911 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.745798 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jbtfr"] Jan 21 14:31:42 crc kubenswrapper[4720]: I0121 14:31:42.777148 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-v6vwc" podStartSLOduration=6.188751924 podStartE2EDuration="1m37.777126856s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:08.862541154 +0000 UTC m=+46.771281096" lastFinishedPulling="2026-01-21 14:31:40.450916096 +0000 UTC m=+138.359656028" observedRunningTime="2026-01-21 14:31:42.776430263 +0000 UTC m=+140.685170215" watchObservedRunningTime="2026-01-21 14:31:42.777126856 +0000 UTC m=+140.685866798" Jan 21 14:31:43 crc kubenswrapper[4720]: I0121 14:31:43.107487 4720 scope.go:117] "RemoveContainer" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:44 crc kubenswrapper[4720]: I0121 14:31:44.683627 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" path="/var/lib/kubelet/pods/aa280405-236d-4a24-896d-04a2dfad8a3a/volumes" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.871080 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.871948 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:45 crc kubenswrapper[4720]: I0121 14:31:45.920905 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.077011 4720 scope.go:117] "RemoveContainer" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.094574 4720 scope.go:117] "RemoveContainer" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.095807 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": container with ID starting with 726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74 not found: ID does not exist" containerID="726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.095905 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74"} err="failed to get container status \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": rpc error: code = NotFound desc = could not find container \"726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74\": container with ID starting with 726833c5340eadfc726a17be564e29850dafa50a93c6d0722e08950ba9a91f74 not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096006 4720 scope.go:117] "RemoveContainer" containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.096463 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": container with ID starting with d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37 not found: ID does not exist" 
containerID="d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096549 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37"} err="failed to get container status \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": rpc error: code = NotFound desc = could not find container \"d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37\": container with ID starting with d984e2a72a467830151f365cfcb9b92dbb5585bf3537764c63136e4213e8bf37 not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.096633 4720 scope.go:117] "RemoveContainer" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: E0121 14:31:46.097126 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": container with ID starting with c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a not found: ID does not exist" containerID="c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.097252 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a"} err="failed to get container status \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": rpc error: code = NotFound desc = could not find container \"c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a\": container with ID starting with c1a8d2ebb0742bda9eb8111d769438c7e4a1020d8851b592dee2b79a7eae798a not found: ID does not exist" Jan 21 14:31:46 crc kubenswrapper[4720]: I0121 14:31:46.784896 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.040915 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.041252 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.081407 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.140119 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8"] Jan 21 14:31:49 crc kubenswrapper[4720]: I0121 14:31:49.805555 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:31:52 crc kubenswrapper[4720]: I0121 14:31:52.880322 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:31:52 crc kubenswrapper[4720]: I0121 14:31:52.880388 4720 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053536 4720 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053910 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053930 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053952 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053964 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.053983 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.053995 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054019 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054032 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054052 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-content" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054082 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054093 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054111 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054123 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.054144 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054156 4720 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="extract-utilities" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054338 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c0e088-7bdf-47f4-b434-b184e742d40a" containerName="kube-multus-additional-cni-plugins" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054358 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d436685f-1f7d-454b-afa4-76389c5c5ff4" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054373 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa280405-236d-4a24-896d-04a2dfad8a3a" containerName="registry-server" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054387 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8131be-bd51-4ed7-bb5c-57990adf304a" containerName="pruner" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.054905 4720 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055053 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055471 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055594 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055727 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055642 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.055513 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" gracePeriod=15 Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.057496 4720 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.058834 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.058871 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.058958 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.058977 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059189 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059262 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059287 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059355 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059389 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059455 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.059484 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.059502 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060010 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060044 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060073 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060095 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060120 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: E0121 14:31:54.060359 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060382 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.060702 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.095345 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252703 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252793 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.252902 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253451 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253497 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253555 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.253592 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.354985 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355409 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.355601 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356017 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356466 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356405 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 
14:31:54.356193 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.356610 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357217 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357161 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.357428 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:54 crc kubenswrapper[4720]: I0121 14:31:54.386752 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.797706 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.799816 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800345 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800710 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.800921 4720 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.800949 4720 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 21 14:31:55 crc kubenswrapper[4720]: E0121 14:31:55.801118 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="200ms" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.804877 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.807057 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808505 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1" exitCode=0 Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808543 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179" exitCode=0 Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808552 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2" exitCode=0 Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808563 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f" exitCode=2 Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.808605 4720 scope.go:117] "RemoveContainer" containerID="f11ff07fdbcacd193207581f957a529571688ff0d97b7a999b16778f106da754" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.810875 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerID="7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e" exitCode=0 Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.810911 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerDied","Data":"7ae518d6f1ac52dac7a894b823c50d52751d81a944e32a3cdcc1dc5e572fb00e"} Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.811650 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:55 crc kubenswrapper[4720]: I0121 14:31:55.812282 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.002626 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="400ms" Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.403318 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="800ms" Jan 21 14:31:56 crc kubenswrapper[4720]: E0121 14:31:56.576917 4720 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-52n8k.188cc5822f10d1b9 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-52n8k,UID:306f9668-a044-448f-a14f-81c9726d3008,APIVersion:v1,ResourceVersion:28349,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 13.321s (13.321s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,LastTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 21 14:31:56 crc kubenswrapper[4720]: W0121 14:31:56.659362 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644 WatchSource:0}: Error finding container 5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644: Status 404 returned error can't find the container with id 5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.831421 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.850338 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5c6ec6c44485fa4df4f4f73e95465073ab06449027ba0b4566b58774a06e0644"}
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.875770 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerStarted","Data":"9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3"}
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.882773 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.882959 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.883120 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.913169 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerStarted","Data":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"}
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.914367 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.914725 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.915580 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:56 crc kubenswrapper[4720]: I0121 14:31:56.915880 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: E0121 14:31:57.204475 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="1.6s"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.262967 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.263918 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.264347 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.264782 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.265065 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297226 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297282 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297308 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") pod \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\" (UID: \"d3bb0d67-7131-40e1-818d-5d4fd5c1a725\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297590 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.297811 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock" (OuterVolumeSpecName: "var-lock") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.303087 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d3bb0d67-7131-40e1-818d-5d4fd5c1a725" (UID: "d3bb0d67-7131-40e1-818d-5d4fd5c1a725"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.383404 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.384183 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.384755 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385053 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385202 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385351 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.385486 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398360 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398376 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398465 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398502 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398524 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398594 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398604 4720 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398613 4720 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398621 4720 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398629 4720 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.398637 4720 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb0d67-7131-40e1-818d-5d4fd5c1a725-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.921687 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerStarted","Data":"a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae"}
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922594 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922843 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922946 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d3bb0d67-7131-40e1-818d-5d4fd5c1a725","Type":"ContainerDied","Data":"ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a"}
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922965 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.922973 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca5672ad37f1aef198f44d464304b97b303638dc1b1a0c650de2bde69ba4a59a"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.923111 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.923646 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.924377 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.924754 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.925352 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926794 4720 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d" exitCode=0
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926864 4720 scope.go:117] "RemoveContainer" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.926926 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.928689 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"}
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.929563 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.929881 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930100 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930300 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.930515 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.931041 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.946334 4720 scope.go:117] "RemoveContainer" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.950322 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951366 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951622 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.951924 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.952247 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.952494 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.957436 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958038 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958184 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958356 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958537 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.958774 4720 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.964097 4720 scope.go:117] "RemoveContainer" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.980863 4720 scope.go:117] "RemoveContainer" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"
Jan 21 14:31:57 crc kubenswrapper[4720]: I0121 14:31:57.998924 4720 scope.go:117] "RemoveContainer" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.017165 4720 scope.go:117] "RemoveContainer" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053212 4720 scope.go:117] "RemoveContainer" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.053826 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": container with ID starting with 4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1 not found: ID does not exist" containerID="4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053864 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1"} err="failed to get container status \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": rpc error: code = NotFound desc = could not find container \"4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1\": container with ID starting with 4a142cc64cbb49f253527712c0d0f4fb3537591d888594bcc703d033b47f67b1 not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.053891 4720 scope.go:117] "RemoveContainer" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054257 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": container with ID starting with 07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179 not found: ID does not exist" containerID="07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054282 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179"} err="failed to get container status \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": rpc error: code = NotFound desc = could not find container \"07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179\": container with ID starting with 07909e8fcad66d4eb5e5d5970161495fba15d03881bec826cc68c42b6e39a179 not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054299 4720 scope.go:117] "RemoveContainer" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054635 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": container with ID starting with 8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2 not found: ID does not exist" containerID="8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054683 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2"} err="failed to get container status \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": rpc error: code = NotFound desc = could not find container \"8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2\": container with ID starting with 8fa84d74fb16eedfc1ec6d2bbf04ec9d74b586ed9aa369ad2f15938af4d956c2 not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054703 4720 scope.go:117] "RemoveContainer" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.054961 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": container with ID starting with c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f not found: ID does not exist" containerID="c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.054990 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f"} err="failed to get container status \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": rpc error: code = NotFound desc = could not find container \"c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f\": container with ID starting with c0f82adfc2ca5b7537bfd8bdf7682f4ec9c8303dcf4e2f9cf82d3523d2d2f26f not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055007 4720 scope.go:117] "RemoveContainer" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.055372 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": container with ID starting with 696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d not found: ID does not exist" containerID="696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055396 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d"} err="failed to get container status \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": rpc error: code = NotFound desc = could not find container \"696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d\": container with ID starting with 696c35d112b31d6eeef69581faf96012141e7e530d66d1fc6e359dbaa43d115d not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055412 4720 scope.go:117] "RemoveContainer" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.055706 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": container with ID starting with 96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b not found: ID does not exist" containerID="96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.055728 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b"} err="failed to get container status \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": rpc error: code = NotFound desc = could not find container \"96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b\": container with ID starting with 96f7bd824a0ba4fe233e4164f18e1cc05407ff39d941f6aad854f7e7995f9f0b not found: ID does not exist"
Jan 21 14:31:58 crc kubenswrapper[4720]: I0121 14:31:58.683982 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Jan 21 14:31:58 crc kubenswrapper[4720]: E0121 14:31:58.806112 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="3.2s"
Jan 21 14:31:59 crc kubenswrapper[4720]: I0121 14:31:59.296529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:31:59 crc kubenswrapper[4720]: I0121 14:31:59.296855 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:31:59 crc kubenswrapper[4720]: E0121 14:31:59.885036 4720 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.103:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-52n8k.188cc5822f10d1b9 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-52n8k,UID:306f9668-a044-448f-a14f-81c9726d3008,APIVersion:v1,ResourceVersion:28349,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 13.321s (13.321s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,LastTimestamp:2026-01-21 14:31:56.575814073 +0000 UTC m=+154.484554005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 21 14:32:00 crc kubenswrapper[4720]: I0121 14:32:00.333411 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" probeResult="failure" output=<
Jan 21 14:32:00 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s
Jan 21 14:32:00 crc kubenswrapper[4720]: >
Jan 21 14:32:02 crc kubenswrapper[4720]: E0121 14:32:02.007451 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="6.4s"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.683463 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684154 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684553 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.684838 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:02 crc kubenswrapper[4720]: I0121 14:32:02.685259 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.215757 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.216096 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.259126 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.259908 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260436 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260761 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.260976 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.261205 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.332289 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.332390 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.375583 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376098 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376309 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376445 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376580 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.376759 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.677826 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.678543 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.679707 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.692930 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.693482 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.693948 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.709872 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.709902 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:06 crc kubenswrapper[4720]: E0121 14:32:06.710230 4720 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:06 crc kubenswrapper[4720]: I0121 14:32:06.710554 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.162132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2bc5237c8b9ecf69a56904a38086b1b556b9da46c038ef02ceb834c19e501708"}
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.208257 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.208752 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209172 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209644 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.209922 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.210141 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212053 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212544 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.212882 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213113 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213317 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.213576 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.888334 4720 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Jan 21 14:32:07 crc kubenswrapper[4720]: I0121 14:32:07.888413 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169219 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169261 4720 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579" exitCode=1
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169304 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579"}
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.169710 4720 scope.go:117] "RemoveContainer" containerID="d0bce3b2637abe56352f67fde03ad8f25f3e40b810255de8fe4eb60361bae579"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170384 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170553 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.170733 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.171754 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172118 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172467 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.172541 4720 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="c299bdc80035bd628c26aad417556c217ec2e0c882354e8f0927d781424a2196" exitCode=0
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173120 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173138 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"c299bdc80035bd628c26aad417556c217ec2e0c882354e8f0927d781424a2196"}
Jan 21 14:32:08 crc kubenswrapper[4720]: E0121 14:32:08.173524 4720 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.103:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.173942 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174089 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174238 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174578 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.174842 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.175025 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:08 crc kubenswrapper[4720]: E0121 14:32:08.408788 4720 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.103:6443: connect: connection refused" interval="7s"
Jan 21 14:32:08 crc kubenswrapper[4720]: I0121 14:32:08.662553 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.183720 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.184066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9be7bdfb6abd673de4e55113d4c6827bef3eaf0bc98e1aafa71d0bf69cfb4526"}
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.184817 4720 status_manager.go:851] "Failed to get status for pod" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" pod="openshift-marketplace/certified-operators-lt46m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-lt46m\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.185235 4720 status_manager.go:851] "Failed to get status for pod" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" pod="openshift-marketplace/community-operators-5qbdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-5qbdf\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.185741 4720 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186033 4720 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186293 4720 status_manager.go:851] "Failed to get status for pod" podUID="306f9668-a044-448f-a14f-81c9726d3008" pod="openshift-marketplace/redhat-operators-52n8k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-52n8k\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.186592 4720 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.103:6443: connect: connection refused"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.188910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9ddb8424839ea7398efb0ec5d1f5b0d9464d767a236fcee8bcdabe25128cb3de"}
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.341904 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:32:09 crc kubenswrapper[4720]: I0121 14:32:09.375775 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197763 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fea1e70658cfd5de4e665e701f39c9e28f5c14cdf428402c4f50ab922342a7d5"}
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197807 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8a823adc71da48a678430092ff973edc582150c83e38d1b2d2f0309cb8dab87a"}
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197817 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77de9f2a568da0068f1fc67a9d4b39b635594a170420f55df3e0125d9cf9b995"}
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.197826 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bf448f72444848b0e0fbd30a6237824707794428462d9c911c90f0d33ac8da61"}
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.198134 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:10 crc kubenswrapper[4720]: I0121 14:32:10.198148 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.711668 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.711741 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.716180 4720 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]log ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]etcd ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiserver-admission-initializer ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-api-request-count-filter ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/openshift.io-startkubeinformers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/generic-apiserver-start-informers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/priority-and-fairness-config-consumer ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/priority-and-fairness-filter ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/storage-object-count-tracker-hook ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiextensions-informers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-apiextensions-controllers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/crd-informer-synced ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-system-namespaces-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-cluster-authentication-info-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-legacy-token-tracking-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-service-ip-repair-controllers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Jan 21 14:32:11 crc kubenswrapper[4720]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/priority-and-fairness-config-producer ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/bootstrap-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/start-kube-aggregator-informers ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-status-local-available-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-status-remote-available-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-registration-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-wait-for-first-sync ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-discovery-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/kube-apiserver-autoregistration ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]autoregister-completion ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-openapi-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: [+]poststarthook/apiservice-openapiv3-controller ok
Jan 21 14:32:11 crc kubenswrapper[4720]: livez check failed
Jan 21 14:32:11 crc kubenswrapper[4720]: I0121 14:32:11.716220 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 21 14:32:12 crc kubenswrapper[4720]: I0121 14:32:12.746157 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:32:12 crc kubenswrapper[4720]: I0121 14:32:12.751668 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 21 14:32:13 crc kubenswrapper[4720]: I0121 14:32:13.211170 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:14 crc kubenswrapper[4720]: I0121 14:32:14.197346 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift" containerID="cri-o://54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" gracePeriod=15 Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.221264 4720 generic.go:334] "Generic (PLEG): container finished" podID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerID="54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" exitCode=0 Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.221365 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerDied","Data":"54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8"} Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.756641 4720 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.758092 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837463 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837524 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837589 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837668 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837708 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837738 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837764 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837780 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837821 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837836 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837850 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.837866 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") pod \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\" (UID: \"45b6b4eb-147f-485e-96e1-5b08ee85ee9f\") " Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.838343 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod 
"45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.838966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839027 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839248 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.839608 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.855706 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.856155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2" (OuterVolumeSpecName: "kube-api-access-p8bf2") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "kube-api-access-p8bf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.858932 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.859133 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.860859 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.861101 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867097 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867623 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.867801 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "45b6b4eb-147f-485e-96e1-5b08ee85ee9f" (UID: "45b6b4eb-147f-485e-96e1-5b08ee85ee9f"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939681 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939713 4720 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939731 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939742 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8bf2\" (UniqueName: \"kubernetes.io/projected/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-kube-api-access-p8bf2\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939753 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939762 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939770 4720 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939780 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939789 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939798 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939806 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939816 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-system-router-certs\") on node \"crc\" 
DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939825 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:15 crc kubenswrapper[4720]: I0121 14:32:15.939834 4720 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/45b6b4eb-147f-485e-96e1-5b08ee85ee9f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.005792 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235413 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235447 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.235586 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.241896 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7xcc8" event={"ID":"45b6b4eb-147f-485e-96e1-5b08ee85ee9f","Type":"ContainerDied","Data":"30cad834f566f85c0f3a6de4d149c40b4e51c114cf6d66d633ef1b6be4e13903"} Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.241960 4720 scope.go:117] "RemoveContainer" containerID="54ba856e17b73ebe5f3f820b898a179502c0c1d8b3de3c4e102633ebd6d04fe8" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.242131 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 21 14:32:16 crc kubenswrapper[4720]: I0121 14:32:16.257875 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.455909 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.686324 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:16 crc kubenswrapper[4720]: E0121 14:32:16.824707 4720 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Jan 21 14:32:17 crc kubenswrapper[4720]: I0121 14:32:17.245605 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:17 crc kubenswrapper[4720]: 
I0121 14:32:17.245702 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a" Jan 21 14:32:17 crc kubenswrapper[4720]: I0121 14:32:17.249190 4720 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="80054503-2592-4091-a856-52e34e3cdb2b" Jan 21 14:32:22 crc kubenswrapper[4720]: I0121 14:32:22.880119 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:32:22 crc kubenswrapper[4720]: I0121 14:32:22.880727 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:32:25 crc kubenswrapper[4720]: I0121 14:32:25.863528 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.568751 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.815832 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 21 14:32:26 crc kubenswrapper[4720]: I0121 14:32:26.853556 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.065639 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.087728 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.102212 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.894760 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 21 14:32:27 crc kubenswrapper[4720]: I0121 14:32:27.899158 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.000108 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.122114 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.131389 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 
14:32:28.144171 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.211341 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.424133 4720 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.470368 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.565133 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.617614 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.747560 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.788819 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 21 14:32:28 crc kubenswrapper[4720]: I0121 14:32:28.791558 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.133513 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.183327 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.267452 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.358767 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.496447 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.519834 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.523994 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.549345 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.624432 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.760815 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.764417 4720 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.838187 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.917551 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 21 14:32:29 crc kubenswrapper[4720]: I0121 14:32:29.959684 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.043920 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.251810 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.290044 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.582001 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.642419 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.674947 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.694244 4720 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.785899 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.830793 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.853178 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.981021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.981347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.991857 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 21 14:32:30 crc kubenswrapper[4720]: I0121 14:32:30.995018 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.087127 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 21 14:32:31 crc 
kubenswrapper[4720]: I0121 14:32:31.209693 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.224213 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.271222 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.275161 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.305851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.398019 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.405091 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.433682 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.470874 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.560927 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.778408 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.794909 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.860632 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.888741 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 21 14:32:31 crc kubenswrapper[4720]: I0121 14:32:31.967457 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.034539 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.114014 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.257393 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.328459 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" 
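The long run of reflector.go:368 "Caches populated" entries around this point is client-go's reflector layer finishing its initial LIST and opening a WATCH for each ConfigMap and Secret referenced by pods on the node; the source tag object-"namespace"/"name" shows the kubelet runs one reflector per referenced object, and the backlog drains all at once here because none of these could sync while api-int.crc.testing:6443 was refusing connections. Below is a minimal sketch of the same sync point using a plain client-go shared informer factory; the kubeconfig path and namespace are placeholders, not values from this log, and this is an illustration of the mechanism rather than the kubelet's actual per-object implementation.

package main

import (
	"fmt"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; inside the kubelet this is its own API client.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Each informer is backed by one reflector: an initial LIST of the type,
	// then a WATCH. The "Caches populated" log line marks the end of that LIST.
	factory := informers.NewSharedInformerFactoryWithOptions(client, 0,
		informers.WithNamespace("openshift-dns")) // placeholder namespace
	cm := factory.Core().V1().ConfigMaps().Informer()
	secrets := factory.Core().V1().Secrets().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)

	// Blocks until the initial list for every started informer is in the
	// local store, i.e. the point at which the sync is logged.
	factory.WaitForCacheSync(stop)
	fmt.Println("configmaps synced:", cm.HasSynced(), "secrets synced:", secrets.HasSynced())
}

Until WaitForCacheSync returns, reads against the local store can miss objects, which is why the kubelet holds off on mounting Secret- and ConfigMap-backed volumes for new pods until the relevant cache reports synced.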
Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.375403 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.454571 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.473061 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.505120 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.607305 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.637329 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.645150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.677003 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.697312 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.697413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.699019 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.699132 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.721083 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.723059 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.818366 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.960915 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 21 14:32:32 crc kubenswrapper[4720]: I0121 14:32:32.990140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.001677 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.213035 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: 
I0121 14:32:33.234160 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.242825 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.273820 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.297436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.319602 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.345094 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.396812 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.400420 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.456639 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.541022 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.542378 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.566133 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.590249 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.603289 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.868493 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.941221 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.986486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 21 14:32:33 crc kubenswrapper[4720]: I0121 14:32:33.986491 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.087384 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.140894 4720 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.179291 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.188835 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.211834 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.265944 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.416851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.449060 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.605935 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.612615 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.617549 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.620105 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.627254 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.695588 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.695779 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.846153 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.879115 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.886352 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 21 14:32:34 crc kubenswrapper[4720]: I0121 14:32:34.895145 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.048624 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 21 14:32:35 crc kubenswrapper[4720]: 
I0121 14:32:35.065870 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.082336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.209413 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.249436 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.288777 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.321043 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.380727 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.414880 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.438004 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.457884 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.495536 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.559150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.563636 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.573910 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.592515 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.597952 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.615251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.709801 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.745648 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.847759 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.852543 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.871218 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 21 14:32:35 crc kubenswrapper[4720]: I0121 14:32:35.880184 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.000145 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.013591 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.067183 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.206631 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.256695 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.303588 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.364500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.448914 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.600499 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.604322 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.675579 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.738319 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.763506 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.817598 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.894265 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.936694 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.972125 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 21 14:32:36 crc kubenswrapper[4720]: I0121 14:32:36.994571 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.096920 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.099542 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.155886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.192177 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.224987 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.296079 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.322643 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.435816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.510235 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.568631 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.576878 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.632323 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.639289 4720 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.705072 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.774215 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Jan 21 14:32:37 crc kubenswrapper[4720]: I0121 14:32:37.930770 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.024370 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.043207 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.157383 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.236021 4720 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.289898 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.401218 4720 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.450868 4720 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.451865 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-52n8k" podStartSLOduration=44.920246146 podStartE2EDuration="2m30.451845084s" podCreationTimestamp="2026-01-21 14:30:08 +0000 UTC" firstStartedPulling="2026-01-21 14:30:11.044201955 +0000 UTC m=+48.952941897" lastFinishedPulling="2026-01-21 14:31:56.575800893 +0000 UTC m=+154.484540835" observedRunningTime="2026-01-21 14:32:15.78413687 +0000 UTC m=+173.692876802" watchObservedRunningTime="2026-01-21 14:32:38.451845084 +0000 UTC m=+196.360585016"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.452224 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lt46m" podStartSLOduration=57.370809771 podStartE2EDuration="2m33.452218115s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:09.995841065 +0000 UTC m=+47.904580997" lastFinishedPulling="2026-01-21 14:31:46.077249409 +0000 UTC m=+143.985989341" observedRunningTime="2026-01-21 14:32:15.887787625 +0000 UTC m=+173.796527587" watchObservedRunningTime="2026-01-21 14:32:38.452218115 +0000 UTC m=+196.360958047"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.453125 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5qbdf" podStartSLOduration=46.828159718 podStartE2EDuration="2m33.453118985s" podCreationTimestamp="2026-01-21 14:30:05 +0000 UTC" firstStartedPulling="2026-01-21 14:30:10.009336254 +0000 UTC m=+47.918076186" lastFinishedPulling="2026-01-21 14:31:56.634295501 +0000 UTC m=+154.543035453" observedRunningTime="2026-01-21 14:32:15.849381305 +0000 UTC m=+173.758121237" watchObservedRunningTime="2026-01-21 14:32:38.453118985 +0000 UTC m=+196.361858927"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.454057 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.454052075 podStartE2EDuration="44.454052075s" podCreationTimestamp="2026-01-21 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:32:15.905414693 +0000 UTC m=+173.814154625" watchObservedRunningTime="2026-01-21 14:32:38.454052075 +0000 UTC m=+196.362792007"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.455783 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7xcc8","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.455911 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7865b47677-vf9fw","openshift-kube-apiserver/kube-apiserver-crc"]
Jan 21 14:32:38 crc kubenswrapper[4720]: E0121 14:32:38.456180 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456279 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer"
Jan 21 14:32:38 crc kubenswrapper[4720]: E0121 14:32:38.456375 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456445 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456298 4720 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456691 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="41e4ae1e-77b8-40b8-9f64-1eba5a39188a"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456611 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" containerName="oauth-openshift"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.456854 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bb0d67-7131-40e1-818d-5d4fd5c1a725" containerName="installer"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.457639 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.461226 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.461573 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462337 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462400 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462561 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.462780 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.465557 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.465875 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.466851 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.466949 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.467521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.467723 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.468066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.479849 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.481529 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.485612 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.494338 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.494309304 podStartE2EDuration="23.494309304s" podCreationTimestamp="2026-01-21 14:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:32:38.486183232 +0000 UTC m=+196.394923214" watchObservedRunningTime="2026-01-21 14:32:38.494309304 +0000 UTC m=+196.403049276"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.494603 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.589307 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.589944 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622863 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622882 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622906 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622941 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.622979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623002 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623122 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623174 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623203 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623220 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.623241 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.646409 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.688335 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b6b4eb-147f-485e-96e1-5b08ee85ee9f" path="/var/lib/kubelet/pods/45b6b4eb-147f-485e-96e1-5b08ee85ee9f/volumes"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724365 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724445 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724499 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724534 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724569 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724639 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724847 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724886 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.724928 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.725091 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-dir\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.726491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.729476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.730741 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731027 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-error\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731303 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-session\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731381 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-audit-policies\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.731493 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-service-ca\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.732132 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-router-certs\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.734948 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-login\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.735350 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.735837 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.736021 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.738521 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5aad29d4-274a-49b7-8b0b-8c4c496206fc-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.749630 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67zx8\" (UniqueName: \"kubernetes.io/projected/5aad29d4-274a-49b7-8b0b-8c4c496206fc-kube-api-access-67zx8\") pod \"oauth-openshift-7865b47677-vf9fw\" (UID: \"5aad29d4-274a-49b7-8b0b-8c4c496206fc\") " pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.784514 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.797910 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.820640 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.886488 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Jan 21 14:32:38 crc kubenswrapper[4720]: I0121 14:32:38.926809 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.046624 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.056113 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.058623 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.161262 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.162085 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.194087 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7865b47677-vf9fw"]
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.333288 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.367204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" event={"ID":"5aad29d4-274a-49b7-8b0b-8c4c496206fc","Type":"ContainerStarted","Data":"87bd5a3fbb62e9dc1c9aaf1dcf20a5689d97d9dc8eff61c3a66f8ec13a46cce8"}
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.459749 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.490391 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.573823 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.596234 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.674748 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.774975 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.780998 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.914298 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Jan 21 14:32:39 crc kubenswrapper[4720]: I0121 14:32:39.962765 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.156200 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.168085 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.221208 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.226265 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.356886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.375229 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" event={"ID":"5aad29d4-274a-49b7-8b0b-8c4c496206fc","Type":"ContainerStarted","Data":"b6c02fa1e53e3273b5182983a84ca77bb1f0c78af6ab01fd4f05e2d46bba0d8b"}
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.375832 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.384277 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.400775 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7865b47677-vf9fw" podStartSLOduration=51.400752335 podStartE2EDuration="51.400752335s" podCreationTimestamp="2026-01-21 14:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:32:40.399497104 +0000 UTC m=+198.308237076" watchObservedRunningTime="2026-01-21 14:32:40.400752335 +0000 UTC m=+198.309492297"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.481486 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.540865 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.641235 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.743154 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.753738 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.827802 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.932770 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 21 14:32:40 crc kubenswrapper[4720]: I0121 14:32:40.973328 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.264336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.290645 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.715137 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.719516 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.797330 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.813865 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.902994 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Jan 21 14:32:41 crc kubenswrapper[4720]: I0121 14:32:41.995562 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.094519 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.095861 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.536848 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 21 14:32:42 crc kubenswrapper[4720]: I0121 14:32:42.662726 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Jan 21 14:32:43 crc kubenswrapper[4720]: I0121 14:32:43.146139 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Jan 21 14:32:48 crc kubenswrapper[4720]: I0121 14:32:48.600099 4720 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 14:32:48 crc kubenswrapper[4720]: I0121 14:32:48.600607 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd" gracePeriod=5
Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.879616 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880251 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880299 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk"
Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880870 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 14:32:52 crc kubenswrapper[4720]: I0121 14:32:52.880960 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" gracePeriod=600
Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.446984 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" exitCode=0
Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.447058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240"}
Jan 21 14:32:53 crc kubenswrapper[4720]: I0121 14:32:53.447244 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"}
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.163903 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.164419 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224752 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224853 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224893 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224914 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224908 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224944 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224988 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.224986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225060 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225158 4720 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225479 4720 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.225508 4720 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.236250 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.326883 4720 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.326918 4720 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456157 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456213 4720 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd" exitCode=137
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456266 4720 scope.go:117] "RemoveContainer" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.456337 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.476152 4720 scope.go:117] "RemoveContainer" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: E0121 14:32:54.476663 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": container with ID starting with f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd not found: ID does not exist" containerID="f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.476735 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd"} err="failed to get container status \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": rpc error: code = NotFound desc = could not find container \"f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd\": container with ID starting with f212557e6523349c38a5f3904c46244ca2cce5c140b79e09c813b80cd762b5cd not found: ID does not exist"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.686986 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.687904 4720 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.699033 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.699072 4720 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5fecc3c3-217b-4d49-a894-931812b93b05"
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.701547 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 21 14:32:54 crc kubenswrapper[4720]: I0121 14:32:54.701582 4720 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5fecc3c3-217b-4d49-a894-931812b93b05"
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.506305 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"]
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.507001 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" containerID="cri-o://566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" gracePeriod=30
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.603994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"]
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.604218 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" containerID="cri-o://aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" gracePeriod=30
Jan 21 14:33:11 crc kubenswrapper[4720]: I0121 14:33:11.936412 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25"
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007688 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007744 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007770 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007837 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.007862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") pod \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\" (UID: \"03eab9ba-e390-43a8-ab91-b8f0fe8678a0\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008610 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008633 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config" (OuterVolumeSpecName: "config") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.008683 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.013406 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9" (OuterVolumeSpecName: "kube-api-access-jj8k9") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "kube-api-access-jj8k9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.013805 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "03eab9ba-e390-43a8-ab91-b8f0fe8678a0" (UID: "03eab9ba-e390-43a8-ab91-b8f0fe8678a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109485 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109520 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109536 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109546 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.109560 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj8k9\" (UniqueName: \"kubernetes.io/projected/03eab9ba-e390-43a8-ab91-b8f0fe8678a0-kube-api-access-jj8k9\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.437109 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514811 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514903 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.514976 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") pod \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\" (UID: \"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8\") "
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.516000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config" (OuterVolumeSpecName: "config") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.516018 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca" (OuterVolumeSpecName: "client-ca") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.519962 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.520300 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q" (OuterVolumeSpecName: "kube-api-access-6tf9q") pod "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" (UID: "e977ca1c-c59f-4c61-8e47-4d03d3b0ced8"). InnerVolumeSpecName "kube-api-access-6tf9q".
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540175 4720 generic.go:334] "Generic (PLEG): container finished" podID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" exitCode=0 Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540238 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerDied","Data":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540242 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540267 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9gr25" event={"ID":"03eab9ba-e390-43a8-ab91-b8f0fe8678a0","Type":"ContainerDied","Data":"1a03a4355bd12eae90e463960102d7b8d0f28a5a014b426c9235206feb008d3a"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.540282 4720 scope.go:117] "RemoveContainer" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544230 4720 generic.go:334] "Generic (PLEG): container finished" podID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" exitCode=0 Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544278 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerDied","Data":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544283 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.544308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk" event={"ID":"e977ca1c-c59f-4c61-8e47-4d03d3b0ced8","Type":"ContainerDied","Data":"830f00cd4952a252732ae85fe73bd3c43f95902077b3e9a257094be91b79359d"} Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.557879 4720 scope.go:117] "RemoveContainer" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: E0121 14:33:12.559014 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": container with ID starting with 566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071 not found: ID does not exist" containerID="566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.559050 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071"} err="failed to get container status \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": rpc error: code = NotFound desc = could not find container \"566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071\": container with ID starting with 566a6d1a5d22263d021363b64e42596b81bd2b7c56316df45c97fc2d4dc05071 not found: ID does not exist" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.559095 4720 scope.go:117] "RemoveContainer" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.575317 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.579977 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9gr25"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.587123 4720 scope.go:117] "RemoveContainer" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: E0121 14:33:12.587586 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": container with ID starting with aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4 not found: ID does not exist" containerID="aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.587673 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4"} err="failed to get container status \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": rpc error: code = NotFound desc = could not find container \"aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4\": container with ID starting with aa3faf8a1f06699eadc416f2c819035756683b2951639a26a35299fd3eb1eeb4 not found: ID does not exist" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.589053 
4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.591705 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bncrk"] Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616821 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tf9q\" (UniqueName: \"kubernetes.io/projected/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-kube-api-access-6tf9q\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616856 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616870 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.616885 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.688716 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" path="/var/lib/kubelet/pods/03eab9ba-e390-43a8-ab91-b8f0fe8678a0/volumes" Jan 21 14:33:12 crc kubenswrapper[4720]: I0121 14:33:12.689277 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" path="/var/lib/kubelet/pods/e977ca1c-c59f-4c61-8e47-4d03d3b0ced8/volumes" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.170799 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171052 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171082 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171096 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171102 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.171112 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171118 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171224 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="03eab9ba-e390-43a8-ab91-b8f0fe8678a0" containerName="controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 
14:33:13.171233 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="e977ca1c-c59f-4c61-8e47-4d03d3b0ced8" containerName="route-controller-manager" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.171239 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.173385 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.180720 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181064 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181294 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181460 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.181677 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.182040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.186694 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.187612 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191365 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.191703 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.194886 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.195672 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.200395 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.204172 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.208045 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.232225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328786 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328865 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.328889 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " 
pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329025 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329116 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329199 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329226 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.329267 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430270 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430327 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430364 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " 
pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430398 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430554 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.430608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432253 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432643 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.432918 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.432968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config kube-api-access-gchcn serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" podUID="4113d218-6a4e-419e-88a3-9f6f22a8dbd5" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.433190 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.436198 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.436942 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.438681 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.454687 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"controller-manager-86cb48757f-q9mqs\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.459552 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:13 crc kubenswrapper[4720]: E0121 14:33:13.460024 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9h4nj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" podUID="cc7f2a3c-bc80-48fe-b417-01789a08fc5f" Jan 21 14:33:13 crc kubenswrapper[4720]: 
I0121 14:33:13.473357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"route-controller-manager-5c6d7f99d9-jgz8m\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.550225 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.550225 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.558759 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.564446 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.733933 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734025 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734050 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734075 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734103 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734149 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734183 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") pod \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\" (UID: \"4113d218-6a4e-419e-88a3-9f6f22a8dbd5\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734206 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") pod \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\" (UID: \"cc7f2a3c-bc80-48fe-b417-01789a08fc5f\") " Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734345 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca" (OuterVolumeSpecName: "client-ca") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734792 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config" (OuterVolumeSpecName: "config") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734796 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config" (OuterVolumeSpecName: "config") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.734880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.735034 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca" (OuterVolumeSpecName: "client-ca") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.737449 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn" (OuterVolumeSpecName: "kube-api-access-gchcn") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "kube-api-access-gchcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.738194 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4113d218-6a4e-419e-88a3-9f6f22a8dbd5" (UID: "4113d218-6a4e-419e-88a3-9f6f22a8dbd5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.738754 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.739755 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj" (OuterVolumeSpecName: "kube-api-access-9h4nj") pod "cc7f2a3c-bc80-48fe-b417-01789a08fc5f" (UID: "cc7f2a3c-bc80-48fe-b417-01789a08fc5f"). InnerVolumeSpecName "kube-api-access-9h4nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835686 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h4nj\" (UniqueName: \"kubernetes.io/projected/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-kube-api-access-9h4nj\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835724 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gchcn\" (UniqueName: \"kubernetes.io/projected/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-kube-api-access-gchcn\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835736 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835750 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835761 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835775 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835785 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 14:33:13.835797 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cc7f2a3c-bc80-48fe-b417-01789a08fc5f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:13 crc kubenswrapper[4720]: I0121 
14:33:13.835807 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4113d218-6a4e-419e-88a3-9f6f22a8dbd5-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.554995 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86cb48757f-q9mqs" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.555010 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.599870 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.609817 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86cb48757f-q9mqs"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.677125 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.685553 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4113d218-6a4e-419e-88a3-9f6f22a8dbd5" path="/var/lib/kubelet/pods/4113d218-6a4e-419e-88a3-9f6f22a8dbd5/volumes" Jan 21 14:33:14 crc kubenswrapper[4720]: I0121 14:33:14.686904 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c6d7f99d9-jgz8m"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.170239 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.171133 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174511 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174827 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.174978 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.175859 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.175907 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.177074 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.177548 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.178571 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.184353 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.184494 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186505 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186725 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.186889 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.187098 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.198845 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.202982 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.203728 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251402 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.251878 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.252027 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " 
pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.252122 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.353598 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354057 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354387 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354687 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354810 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.354959 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" Jan 21 14:33:15 crc 
kubenswrapper[4720]: I0121 14:33:15.355056 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355189 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.355483 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.356355 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.357683 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.378204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"controller-manager-54464559b6-jzh6z\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") " pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456457 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.456586 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.457377 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.458030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.459583 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.473591 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"route-controller-manager-5cf5674648-vlcxj\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.496584 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.510060 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.666977 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"]
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.667429 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server" containerID="cri-o://a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" gracePeriod=2
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.712562 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"]
Jan 21 14:33:15 crc kubenswrapper[4720]: I0121 14:33:15.753255 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"]
Jan 21 14:33:15 crc kubenswrapper[4720]: W0121 14:33:15.754307 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d75fb57_7a86_4641_8f13_4cbcae180901.slice/crio-a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b WatchSource:0}: Error finding container a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b: Status 404 returned error can't find the container with id a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b
Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.332767 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333354 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333742 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" cmd=["grpc_health_probe","-addr=:50051"]
Jan 21 14:33:16 crc kubenswrapper[4720]: E0121 14:33:16.333789 4720 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-lt46m" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.570838 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerStarted","Data":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.570891 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerStarted","Data":"a827d68c41cf6bca1d1353db6d4c691cd0bbcd9fa7fef0db59ccff42a67e61f8"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.571063 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574198 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" exitCode=0
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574278 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lt46m" event={"ID":"7bb4c793-0d05-43f9-a9ad-30d9b6b40595","Type":"ContainerDied","Data":"328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.574312 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="328b3e95ade1caeae4e693dd7d243f33f61953dabc84aa7d096915ec1cb9417f"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.575765 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerStarted","Data":"e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.575790 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerStarted","Data":"a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b"}
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.577303 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.583075 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.588541 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.603422 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" podStartSLOduration=3.60340771 podStartE2EDuration="3.60340771s" podCreationTimestamp="2026-01-21 14:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:16.600095106 +0000 UTC m=+234.508835048" watchObservedRunningTime="2026-01-21 14:33:16.60340771 +0000 UTC m=+234.512147632"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.608147 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.636630 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" podStartSLOduration=3.636409758 podStartE2EDuration="3.636409758s" podCreationTimestamp="2026-01-21 14:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:16.631340932 +0000 UTC m=+234.540080864" watchObservedRunningTime="2026-01-21 14:33:16.636409758 +0000 UTC m=+234.545149710"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.684454 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc7f2a3c-bc80-48fe-b417-01789a08fc5f" path="/var/lib/kubelet/pods/cc7f2a3c-bc80-48fe-b417-01789a08fc5f/volumes"
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772777 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") "
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772844 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") "
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.772907 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") pod \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\" (UID: \"7bb4c793-0d05-43f9-a9ad-30d9b6b40595\") "
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.773793 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities" (OuterVolumeSpecName: "utilities") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.791080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp" (OuterVolumeSpecName: "kube-api-access-kxcbp") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "kube-api-access-kxcbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.864885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bb4c793-0d05-43f9-a9ad-30d9b6b40595" (UID: "7bb4c793-0d05-43f9-a9ad-30d9b6b40595"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877508 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877534 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxcbp\" (UniqueName: \"kubernetes.io/projected/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-kube-api-access-kxcbp\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:16 crc kubenswrapper[4720]: I0121 14:33:16.877544 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb4c793-0d05-43f9-a9ad-30d9b6b40595-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.581559 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lt46m"
Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.626286 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"]
Jan 21 14:33:17 crc kubenswrapper[4720]: I0121 14:33:17.643933 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lt46m"]
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.261881 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"]
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.262177 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-52n8k" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server" containerID="cri-o://9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3" gracePeriod=2
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.587123 4720 generic.go:334] "Generic (PLEG): container finished" podID="306f9668-a044-448f-a14f-81c9726d3008" containerID="9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3" exitCode=0
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.588082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3"}
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.630224 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.684151 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" path="/var/lib/kubelet/pods/7bb4c793-0d05-43f9-a9ad-30d9b6b40595/volumes"
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") "
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802543 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") "
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.802561 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") pod \"306f9668-a044-448f-a14f-81c9726d3008\" (UID: \"306f9668-a044-448f-a14f-81c9726d3008\") "
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.803487 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities" (OuterVolumeSpecName: "utilities") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.807121 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg" (OuterVolumeSpecName: "kube-api-access-t4pfg") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "kube-api-access-t4pfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.905221 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4pfg\" (UniqueName: \"kubernetes.io/projected/306f9668-a044-448f-a14f-81c9726d3008-kube-api-access-t4pfg\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.905257 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:18 crc kubenswrapper[4720]: I0121 14:33:18.913468 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "306f9668-a044-448f-a14f-81c9726d3008" (UID: "306f9668-a044-448f-a14f-81c9726d3008"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.007221 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306f9668-a044-448f-a14f-81c9726d3008-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.596810 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-52n8k" event={"ID":"306f9668-a044-448f-a14f-81c9726d3008","Type":"ContainerDied","Data":"26891b408ccd24b0c8434d044528c04f82c156ee44333c5cc05cf38ad2ef94ce"}
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.597260 4720 scope.go:117] "RemoveContainer" containerID="9aac69c47b901a51f1f77aeb2f7ba200d24ee13fa6a5d85f7f3f5f24f22716a3"
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.596888 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-52n8k"
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.633093 4720 scope.go:117] "RemoveContainer" containerID="359803a342c5c510fb51706cab89d859016c20be09a4df27bb7da03e276e9272"
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.635461 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"]
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.644074 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-52n8k"]
Jan 21 14:33:19 crc kubenswrapper[4720]: I0121 14:33:19.660960 4720 scope.go:117] "RemoveContainer" containerID="23db2e3dd80933444006432f7c28ae6c0623796c99b317cd90c3617bb24ec475"
Jan 21 14:33:20 crc kubenswrapper[4720]: I0121 14:33:20.720056 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306f9668-a044-448f-a14f-81c9726d3008" path="/var/lib/kubelet/pods/306f9668-a044-448f-a14f-81c9726d3008/volumes"
Jan 21 14:33:31 crc kubenswrapper[4720]: I0121 14:33:31.535165 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"]
Jan 21 14:33:31 crc kubenswrapper[4720]: I0121 14:33:31.535812 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager" containerID="cri-o://e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2" gracePeriod=30
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.673188 4720 generic.go:334] "Generic (PLEG): container finished" podID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerID="e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2" exitCode=0
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.673492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerDied","Data":"e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2"}
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.976931 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983031 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") "
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983120 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") "
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983214 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") "
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983266 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") "
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983335 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") pod \"7d75fb57-7a86-4641-8f13-4cbcae180901\" (UID: \"7d75fb57-7a86-4641-8f13-4cbcae180901\") "
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.983879 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.984301 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config" (OuterVolumeSpecName: "config") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.984558 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.992537 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj" (OuterVolumeSpecName: "kube-api-access-pghmj") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "kube-api-access-pghmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:32 crc kubenswrapper[4720]: I0121 14:33:32.992846 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d75fb57-7a86-4641-8f13-4cbcae180901" (UID: "7d75fb57-7a86-4641-8f13-4cbcae180901"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010096 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"]
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010380 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010396 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010409 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-utilities"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010417 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-utilities"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010426 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010433 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010440 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-content"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010448 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-content"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010458 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-utilities"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010466 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="extract-utilities"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010477 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-content"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010484 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="extract-content"
Jan 21 14:33:33 crc kubenswrapper[4720]: E0121 14:33:33.010496 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010502 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010647 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" containerName="controller-manager"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010684 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="306f9668-a044-448f-a14f-81c9726d3008" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.010699 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb4c793-0d05-43f9-a9ad-30d9b6b40595" containerName="registry-server"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.011162 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.037243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"]
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084022 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084118 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084168 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084289 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pghmj\" (UniqueName: \"kubernetes.io/projected/7d75fb57-7a86-4641-8f13-4cbcae180901-kube-api-access-pghmj\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084305 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084316 4720 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084326 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d75fb57-7a86-4641-8f13-4cbcae180901-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.084335 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d75fb57-7a86-4641-8f13-4cbcae180901-client-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185279 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185328 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.185468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.186722 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-client-ca\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.187127 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-config\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.188132 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b99b212d-c1f0-4082-a4a4-8e4b657183a9-proxy-ca-bundles\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.199144 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b99b212d-c1f0-4082-a4a4-8e4b657183a9-serving-cert\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.204073 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vt55\" (UniqueName: \"kubernetes.io/projected/b99b212d-c1f0-4082-a4a4-8e4b657183a9-kube-api-access-8vt55\") pod \"controller-manager-6685c7d6d5-5tgnz\" (UID: \"b99b212d-c1f0-4082-a4a4-8e4b657183a9\") " pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.339007 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.540968 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"]
Jan 21 14:33:33 crc kubenswrapper[4720]: W0121 14:33:33.547129 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99b212d_c1f0_4082_a4a4_8e4b657183a9.slice/crio-56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747 WatchSource:0}: Error finding container 56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747: Status 404 returned error can't find the container with id 56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z" event={"ID":"7d75fb57-7a86-4641-8f13-4cbcae180901","Type":"ContainerDied","Data":"a6080bac1a145ecb3a8ac5bc7db21273027a2c001a70b4341c67dad2f7fc220b"}
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681503 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54464559b6-jzh6z"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.681525 4720 scope.go:117] "RemoveContainer" containerID="e5225bde46f6d1977380e310105ba78dc0335e0a200232faef190ccdb9de5fb2"
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.682308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" event={"ID":"b99b212d-c1f0-4082-a4a4-8e4b657183a9","Type":"ContainerStarted","Data":"56af3eb69da1bd135159d80389e0f98c7bf7b1c0cc0b8379625c9b2bd29d2747"}
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.723016 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"]
Jan 21 14:33:33 crc kubenswrapper[4720]: I0121 14:33:33.728498 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54464559b6-jzh6z"]
Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.685132 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d75fb57-7a86-4641-8f13-4cbcae180901" path="/var/lib/kubelet/pods/7d75fb57-7a86-4641-8f13-4cbcae180901/volumes"
Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.687333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" event={"ID":"b99b212d-c1f0-4082-a4a4-8e4b657183a9","Type":"ContainerStarted","Data":"b9f898336d897d4aae76d485dc53f15efa3bbf534209d9b999afa589783f4e53"}
Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.687535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.693317 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz"
Jan 21 14:33:34 crc kubenswrapper[4720]: I0121 14:33:34.713044 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6685c7d6d5-5tgnz" podStartSLOduration=3.713020817 podStartE2EDuration="3.713020817s" podCreationTimestamp="2026-01-21 14:33:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:34.710259618 +0000 UTC m=+252.618999550" watchObservedRunningTime="2026-01-21 14:33:34.713020817 +0000 UTC m=+252.621760769"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.652195 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.652976 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v6vwc" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" containerID="cri-o://c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" gracePeriod=30
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.664026 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.664453 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5qbdf" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" containerID="cri-o://37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" gracePeriod=30
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.682853 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.687929 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" containerID="cri-o://d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" gracePeriod=30
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.698231 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.698747 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c95rn" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" containerID="cri-o://eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" gracePeriod=30
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.708958 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.709609 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x7575" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" containerID="cri-o://0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" gracePeriod=30
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.715435 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.717950 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.723046 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"]
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800243 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800325 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.800368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.900940 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.901010 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.901033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.904505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.924480 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:41 crc kubenswrapper[4720]: I0121 14:33:41.926546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnvw\" (UniqueName: \"kubernetes.io/projected/fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c-kube-api-access-llnvw\") pod \"marketplace-operator-79b997595-s9hd2\" (UID: \"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c\") " pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.043041 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.306603 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.315945 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.316030 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.316055 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") pod \"1d6131a5-b63e-42a5-905a-9ed5350a421a\" (UID: \"1d6131a5-b63e-42a5-905a-9ed5350a421a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.317091 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities" (OuterVolumeSpecName: "utilities") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.329500 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4" (OuterVolumeSpecName: "kube-api-access-dmfz4") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "kube-api-access-dmfz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.417400 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmfz4\" (UniqueName: \"kubernetes.io/projected/1d6131a5-b63e-42a5-905a-9ed5350a421a-kube-api-access-dmfz4\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.417431 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.426598 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d6131a5-b63e-42a5-905a-9ed5350a421a" (UID: "1d6131a5-b63e-42a5-905a-9ed5350a421a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.445835 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.460133 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.473947 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518477 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518537 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518556 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518584 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518677 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518735 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") pod \"328ecaa4-59eb-4707-a320-245636d0c778\" (UID: \"328ecaa4-59eb-4707-a320-245636d0c778\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518756 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") pod \"4bbb0e48-d287-42fc-a165-86038d2083c9\" (UID: \"4bbb0e48-d287-42fc-a165-86038d2083c9\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518776 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") pod \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\" (UID: \"8432f9d9-0168-4b49-b6a7-66281f46bd5a\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.518973 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d6131a5-b63e-42a5-905a-9ed5350a421a-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520268 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities" (OuterVolumeSpecName: "utilities") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520742 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities" (OuterVolumeSpecName: "utilities") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.520823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities" (OuterVolumeSpecName: "utilities") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.525739 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc" (OuterVolumeSpecName: "kube-api-access-mmzpc") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "kube-api-access-mmzpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.530143 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg" (OuterVolumeSpecName: "kube-api-access-4szdg") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "kube-api-access-4szdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.530503 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s" (OuterVolumeSpecName: "kube-api-access-sfn9s") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "kube-api-access-sfn9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.533921 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.550206 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8432f9d9-0168-4b49-b6a7-66281f46bd5a" (UID: "8432f9d9-0168-4b49-b6a7-66281f46bd5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621430 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621484 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621511 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") pod \"90d203a9-910b-471c-afb5-e487b65136ac\" (UID: \"90d203a9-910b-471c-afb5-e487b65136ac\") "
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621970 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.621989 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szdg\" (UniqueName: \"kubernetes.io/projected/8432f9d9-0168-4b49-b6a7-66281f46bd5a-kube-api-access-4szdg\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622001 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622009 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622018 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfn9s\" (UniqueName: \"kubernetes.io/projected/4bbb0e48-d287-42fc-a165-86038d2083c9-kube-api-access-sfn9s\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622026 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8432f9d9-0168-4b49-b6a7-66281f46bd5a-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622035 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmzpc\" (UniqueName: \"kubernetes.io/projected/328ecaa4-59eb-4707-a320-245636d0c778-kube-api-access-mmzpc\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.622451 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.624853 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7" (OuterVolumeSpecName: "kube-api-access-q7dq7") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "kube-api-access-q7dq7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.624883 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "90d203a9-910b-471c-afb5-e487b65136ac" (UID: "90d203a9-910b-471c-afb5-e487b65136ac"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.632917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4bbb0e48-d287-42fc-a165-86038d2083c9" (UID: "4bbb0e48-d287-42fc-a165-86038d2083c9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.655008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "328ecaa4-59eb-4707-a320-245636d0c778" (UID: "328ecaa4-59eb-4707-a320-245636d0c778"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.665642 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s9hd2"]
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723253 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/328ecaa4-59eb-4707-a320-245636d0c778-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723298 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7dq7\" (UniqueName: \"kubernetes.io/projected/90d203a9-910b-471c-afb5-e487b65136ac-kube-api-access-q7dq7\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723309 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723318 4720 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/90d203a9-910b-471c-afb5-e487b65136ac-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.723327 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4bbb0e48-d287-42fc-a165-86038d2083c9-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739130 4720 generic.go:334] "Generic (PLEG): container finished" podID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" exitCode=0
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"}
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739231 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5qbdf" event={"ID":"4bbb0e48-d287-42fc-a165-86038d2083c9","Type":"ContainerDied","Data":"85ca11cc33d09ce2c8fd7bab9c3118f3fb41bcc9c4f1e36c585b8c6b04ce1492"}
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739249 4720 scope.go:117] "RemoveContainer" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"
Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.739247 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5qbdf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742341 4720 generic.go:334] "Generic (PLEG): container finished" podID="328ecaa4-59eb-4707-a320-245636d0c778" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742372 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742747 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x7575" event={"ID":"328ecaa4-59eb-4707-a320-245636d0c778","Type":"ContainerDied","Data":"b8380163a8adecc8544c40abaa4d48a79fd0c040f667b22d69969e4736058c2d"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.742928 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x7575" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746441 4720 generic.go:334] "Generic (PLEG): container finished" podID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746493 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746518 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c95rn" event={"ID":"8432f9d9-0168-4b49-b6a7-66281f46bd5a","Type":"ContainerDied","Data":"9c2892b80c1d95c871202545822430a42e2c2316e71ccc122df3bcadd593a956"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.746831 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c95rn" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.749233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" event={"ID":"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c","Type":"ContainerStarted","Data":"5ef63d867a51c1b59404172d62818837198f493d69b70fdf881163c4bba9bc7d"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750468 4720 generic.go:334] "Generic (PLEG): container finished" podID="90d203a9-910b-471c-afb5-e487b65136ac" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750557 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerDied","Data":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" event={"ID":"90d203a9-910b-471c-afb5-e487b65136ac","Type":"ContainerDied","Data":"617f70e18e4e0f9b72a22ff92ce1fc94aae99827e9d16ba9cde606ce5a9e499c"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.750887 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vxdw2" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753072 4720 generic.go:334] "Generic (PLEG): container finished" podID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" exitCode=0 Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753130 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v6vwc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.753130 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.754446 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v6vwc" event={"ID":"1d6131a5-b63e-42a5-905a-9ed5350a421a","Type":"ContainerDied","Data":"d3b5cdbc839bad4c3029ff33f78cd38f5b5e460e9963f6c280d92ade619bd510"} Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.769486 4720 scope.go:117] "RemoveContainer" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.780396 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.793139 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5qbdf"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.831340 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.832268 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x7575"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.841632 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.846627 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c95rn"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.850172 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.853089 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v6vwc"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.853557 4720 scope.go:117] "RemoveContainer" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.856788 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.864987 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vxdw2"] Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.866806 4720 scope.go:117] "RemoveContainer" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.867224 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": container with ID starting with 37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97 not found: ID does not exist" containerID="37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867253 4720 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97"} err="failed to get container status \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": rpc error: code = NotFound desc = could not find container \"37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97\": container with ID starting with 37feb5de5a4e232cd57fa0f486f5aa5cb3b8174eff24aad08adef76c135bae97 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867273 4720 scope.go:117] "RemoveContainer" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.867447 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": container with ID starting with d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604 not found: ID does not exist" containerID="d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867467 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604"} err="failed to get container status \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": rpc error: code = NotFound desc = could not find container \"d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604\": container with ID starting with d6271eba8343a64a510743b92076c0b8cbf3dfa8ac7bd3c494ffad17e390a604 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.867480 4720 scope.go:117] "RemoveContainer" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.868125 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": container with ID starting with 913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6 not found: ID does not exist" containerID="913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.868146 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6"} err="failed to get container status \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": rpc error: code = NotFound desc = could not find container \"913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6\": container with ID starting with 913d1830ceed6ae40bbb4c04398f1f327e8c16bbd8735fc74ac413d3ad20a3f6 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.868158 4720 scope.go:117] "RemoveContainer" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.881593 4720 scope.go:117] "RemoveContainer" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.899466 4720 scope.go:117] "RemoveContainer" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 
crc kubenswrapper[4720]: I0121 14:33:42.920580 4720 scope.go:117] "RemoveContainer" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.920848 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": container with ID starting with 0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf not found: ID does not exist" containerID="0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.920963 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf"} err="failed to get container status \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": rpc error: code = NotFound desc = could not find container \"0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf\": container with ID starting with 0d492e75765297f2fad45ffe41414d6dbafe8642c6bee06687ddf6c8715a19bf not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921038 4720 scope.go:117] "RemoveContainer" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.921734 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": container with ID starting with ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62 not found: ID does not exist" containerID="ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921841 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62"} err="failed to get container status \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": rpc error: code = NotFound desc = could not find container \"ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62\": container with ID starting with ea8b6d61a3c6cf5240b3168f98e11a3daa5ea4b4ec1019ccba81cc77ae7c8c62 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.921908 4720 scope.go:117] "RemoveContainer" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.922194 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": container with ID starting with ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744 not found: ID does not exist" containerID="ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.922271 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744"} err="failed to get container status \"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": rpc error: code = NotFound desc = could not find container 
\"ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744\": container with ID starting with ffff4cc4dd421fc3d140565ccec23e5e6c9e5bcc82c5cbbe3391d78e0a095744 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.922341 4720 scope.go:117] "RemoveContainer" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.943340 4720 scope.go:117] "RemoveContainer" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.956432 4720 scope.go:117] "RemoveContainer" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974145 4720 scope.go:117] "RemoveContainer" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.974641 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": container with ID starting with eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc not found: ID does not exist" containerID="eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974693 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc"} err="failed to get container status \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": rpc error: code = NotFound desc = could not find container \"eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc\": container with ID starting with eb2ea66de0c15f757e1ea3f3a7ba866b91eccc8717ebe3d5a9ce44767b9405dc not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.974718 4720 scope.go:117] "RemoveContainer" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.975219 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": container with ID starting with 0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98 not found: ID does not exist" containerID="0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975248 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98"} err="failed to get container status \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": rpc error: code = NotFound desc = could not find container \"0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98\": container with ID starting with 0c8a7ca936259535e52d9f1f75585d2ac601a4266b11a725dd5c872f792d1b98 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975267 4720 scope.go:117] "RemoveContainer" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.975519 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": container with ID starting with aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d not found: ID does not exist" containerID="aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975597 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d"} err="failed to get container status \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": rpc error: code = NotFound desc = could not find container \"aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d\": container with ID starting with aff1c70dad44e4a9235659b4d3aa767982940cb15a22e40293f61a1cba5b043d not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.975685 4720 scope.go:117] "RemoveContainer" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.992837 4720 scope.go:117] "RemoveContainer" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: E0121 14:33:42.993273 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": container with ID starting with d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705 not found: ID does not exist" containerID="d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.993315 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705"} err="failed to get container status \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": rpc error: code = NotFound desc = could not find container \"d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705\": container with ID starting with d05d82d82af0a06d610676d66e7cb7ca7f7be3c13b938925658fdac6c7476705 not found: ID does not exist" Jan 21 14:33:42 crc kubenswrapper[4720]: I0121 14:33:42.993343 4720 scope.go:117] "RemoveContainer" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.009139 4720 scope.go:117] "RemoveContainer" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.032340 4720 scope.go:117] "RemoveContainer" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.048495 4720 scope.go:117] "RemoveContainer" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.048895 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": container with ID starting with c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23 not found: ID does not exist" containerID="c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.048967 
4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23"} err="failed to get container status \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": rpc error: code = NotFound desc = could not find container \"c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23\": container with ID starting with c0e0fa0c98abdaa51dfb14628a67a65e3156ac9c16cbfadc4748a6699596aa23 not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049017 4720 scope.go:117] "RemoveContainer" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.049376 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": container with ID starting with de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb not found: ID does not exist" containerID="de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049401 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb"} err="failed to get container status \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": rpc error: code = NotFound desc = could not find container \"de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb\": container with ID starting with de24d7572be1ab2e8506e6c1275ce9f8ebc0d3feb9e5d97e950e10f251941efb not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049418 4720 scope.go:117] "RemoveContainer" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: E0121 14:33:43.049768 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": container with ID starting with 30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1 not found: ID does not exist" containerID="30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.049813 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1"} err="failed to get container status \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": rpc error: code = NotFound desc = could not find container \"30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1\": container with ID starting with 30172f5a091dbf43920d3bc422d548a7d74dec60e89e6b9d22bca8e36b6c2ed1 not found: ID does not exist" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.760825 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" event={"ID":"fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c","Type":"ContainerStarted","Data":"94d24540bb35c6fa830ff15b1e745d19fa1bc384917d0d58afcd9bc6efd8f3ad"} Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.761754 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:43 
crc kubenswrapper[4720]: I0121 14:33:43.763737 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" Jan 21 14:33:43 crc kubenswrapper[4720]: I0121 14:33:43.803811 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s9hd2" podStartSLOduration=2.803789497 podStartE2EDuration="2.803789497s" podCreationTimestamp="2026-01-21 14:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:33:43.784871794 +0000 UTC m=+261.693611726" watchObservedRunningTime="2026-01-21 14:33:43.803789497 +0000 UTC m=+261.712529439" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.287607 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288053 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288068 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288083 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288106 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288114 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288125 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288135 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288152 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288162 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288173 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288181 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288194 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 
14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288201 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288216 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288224 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288235 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288243 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288254 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288262 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="extract-content" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288271 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288279 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288318 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288326 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: E0121 14:33:44.288340 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288348 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="extract-utilities" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288467 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288480 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288492 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d203a9-910b-471c-afb5-e487b65136ac" containerName="marketplace-operator" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288502 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.288519 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="328ecaa4-59eb-4707-a320-245636d0c778" containerName="registry-server" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.289344 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.292801 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.304903 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345702 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345771 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-catalog-content\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.345921 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447299 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447377 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447414 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-catalog-content\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-utilities\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.447977 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f47a635-f04f-4002-a264-f10be8c70e10-catalog-content\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.466265 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcffq\" (UniqueName: \"kubernetes.io/projected/1f47a635-f04f-4002-a264-f10be8c70e10-kube-api-access-zcffq\") pod \"redhat-marketplace-7fb4w\" (UID: \"1f47a635-f04f-4002-a264-f10be8c70e10\") " pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.603609 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.695120 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d6131a5-b63e-42a5-905a-9ed5350a421a" path="/var/lib/kubelet/pods/1d6131a5-b63e-42a5-905a-9ed5350a421a/volumes" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.696199 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="328ecaa4-59eb-4707-a320-245636d0c778" path="/var/lib/kubelet/pods/328ecaa4-59eb-4707-a320-245636d0c778/volumes" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.701088 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bbb0e48-d287-42fc-a165-86038d2083c9" path="/var/lib/kubelet/pods/4bbb0e48-d287-42fc-a165-86038d2083c9/volumes" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.702365 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8432f9d9-0168-4b49-b6a7-66281f46bd5a" path="/var/lib/kubelet/pods/8432f9d9-0168-4b49-b6a7-66281f46bd5a/volumes" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.707958 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d203a9-910b-471c-afb5-e487b65136ac" path="/var/lib/kubelet/pods/90d203a9-910b-471c-afb5-e487b65136ac/volumes" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.892634 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"] Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.894034 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.897327 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.915410 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"] Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954009 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954075 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:44 crc kubenswrapper[4720]: I0121 14:33:44.954151 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.017558 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7fb4w"] Jan 21 14:33:45 crc kubenswrapper[4720]: W0121 14:33:45.025252 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f47a635_f04f_4002_a264_f10be8c70e10.slice/crio-a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075 WatchSource:0}: Error finding container a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075: Status 404 returned error can't find the container with id a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075 Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.054722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055118 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-catalog-content\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055246 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc 
kubenswrapper[4720]: I0121 14:33:45.055491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86ba467d-dfbe-493b-acf6-17b938a753b0-utilities\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.055541 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.074395 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph265\" (UniqueName: \"kubernetes.io/projected/86ba467d-dfbe-493b-acf6-17b938a753b0-kube-api-access-ph265\") pod \"redhat-operators-4hxc8\" (UID: \"86ba467d-dfbe-493b-acf6-17b938a753b0\") " pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.211055 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.587715 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4hxc8"] Jan 21 14:33:45 crc kubenswrapper[4720]: W0121 14:33:45.590923 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86ba467d_dfbe_493b_acf6_17b938a753b0.slice/crio-a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623 WatchSource:0}: Error finding container a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623: Status 404 returned error can't find the container with id a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623 Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.776417 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerStarted","Data":"a1afdaf85bb82edbc95f4c81b12a315744320806ab04c6d8dd48f83c26bcb075"} Jan 21 14:33:45 crc kubenswrapper[4720]: I0121 14:33:45.778102 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"a4d607da7df13b8e3893f0d3f0dfdbc6a1d96158509d17718eec429d147f2623"} Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.680752 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.682076 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.687032 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.699068 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777395 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.777535 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.793822 4720 generic.go:334] "Generic (PLEG): container finished" podID="1f47a635-f04f-4002-a264-f10be8c70e10" containerID="9c2b1c72fedf2b087f889500a8bf7249fcb8c582f96a5b18b72fb4e06dd0c998" exitCode=0 Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.793935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerDied","Data":"9c2b1c72fedf2b087f889500a8bf7249fcb8c582f96a5b18b72fb4e06dd0c998"} Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.795756 4720 generic.go:334] "Generic (PLEG): container finished" podID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerID="78463a7970d148b08439482a23b9dba952d67553710c3a6d71b9d262255a9e61" exitCode=0 Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.795805 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerDied","Data":"78463a7970d148b08439482a23b9dba952d67553710c3a6d71b9d262255a9e61"} Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878138 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878187 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: 
\"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878222 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878792 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.878921 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:46 crc kubenswrapper[4720]: I0121 14:33:46.908419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"certified-operators-kb2c7\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") " pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.006558 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.280634 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.288063 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.289758 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.289840 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385700 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.385821 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.436689 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"] Jan 21 14:33:47 crc kubenswrapper[4720]: W0121 14:33:47.440059 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9a5b258_9d31_4031_85f0_1c8d00da3dda.slice/crio-802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420 WatchSource:0}: Error finding container 802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420: Status 404 returned error can't find the container with id 802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420 Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486898 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.486927 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " 
pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.487290 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-catalog-content\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.487429 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9a3c893-2903-4355-9af3-b8f981477494-utilities\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.505148 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc9kv\" (UniqueName: \"kubernetes.io/projected/f9a3c893-2903-4355-9af3-b8f981477494-kube-api-access-sc9kv\") pod \"community-operators-bqrkw\" (UID: \"f9a3c893-2903-4355-9af3-b8f981477494\") " pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.607721 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.801524 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420"} Jan 21 14:33:47 crc kubenswrapper[4720]: I0121 14:33:47.973384 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqrkw"] Jan 21 14:33:47 crc kubenswrapper[4720]: W0121 14:33:47.981878 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9a3c893_2903_4355_9af3_b8f981477494.slice/crio-248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3 WatchSource:0}: Error finding container 248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3: Status 404 returned error can't find the container with id 248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.807413 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb" exitCode=0 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.807671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb"} Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809311 4720 generic.go:334] "Generic (PLEG): container finished" podID="f9a3c893-2903-4355-9af3-b8f981477494" containerID="72b66af0c5d4d88f0ff56206b6cd5a927d24dfc13eb460c55f2bbe2c7c2bb175" exitCode=0 Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809349 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" 
event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerDied","Data":"72b66af0c5d4d88f0ff56206b6cd5a927d24dfc13eb460c55f2bbe2c7c2bb175"} Jan 21 14:33:48 crc kubenswrapper[4720]: I0121 14:33:48.809439 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"248f1de11ffc02b8afd1183a8141c21d72a9db3ac8c65433f9f98da11bc419f3"} Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.815868 4720 generic.go:334] "Generic (PLEG): container finished" podID="1f47a635-f04f-4002-a264-f10be8c70e10" containerID="bde9b617e649c39bfcf92df7b4beae07bc25e3961c9a5ee920a6300015135379" exitCode=0 Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.815958 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerDied","Data":"bde9b617e649c39bfcf92df7b4beae07bc25e3961c9a5ee920a6300015135379"} Jan 21 14:33:49 crc kubenswrapper[4720]: I0121 14:33:49.819164 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163"} Jan 21 14:33:51 crc kubenswrapper[4720]: I0121 14:33:51.829627 4720 generic.go:334] "Generic (PLEG): container finished" podID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerID="7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163" exitCode=0 Jan 21 14:33:51 crc kubenswrapper[4720]: I0121 14:33:51.829707 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerDied","Data":"7ca7f8a5c20e3bddacbc12f16d445aca464719b6c925391c8b06c96fdd022163"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.836986 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.839727 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7fb4w" event={"ID":"1f47a635-f04f-4002-a264-f10be8c70e10","Type":"ContainerStarted","Data":"4c38f4240a11f753005e29df2dcef41e087ace821f7289909af6860f0f7ff948"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.842273 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.848919 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4hxc8" event={"ID":"86ba467d-dfbe-493b-acf6-17b938a753b0","Type":"ContainerStarted","Data":"458c08bd42934ef6f7bfe4dffa6066112a7cc0d9e15db613da2b5ef519eca59a"} Jan 21 14:33:52 crc kubenswrapper[4720]: I0121 14:33:52.913772 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4hxc8" podStartSLOduration=3.50444645 podStartE2EDuration="8.913755524s" podCreationTimestamp="2026-01-21 14:33:44 +0000 UTC" 
firstStartedPulling="2026-01-21 14:33:46.79716586 +0000 UTC m=+264.705905832" lastFinishedPulling="2026-01-21 14:33:52.206474964 +0000 UTC m=+270.115214906" observedRunningTime="2026-01-21 14:33:52.88989287 +0000 UTC m=+270.798632802" watchObservedRunningTime="2026-01-21 14:33:52.913755524 +0000 UTC m=+270.822495466" Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.856808 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124" exitCode=0 Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.856899 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124"} Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.860593 4720 generic.go:334] "Generic (PLEG): container finished" podID="f9a3c893-2903-4355-9af3-b8f981477494" containerID="3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42" exitCode=0 Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.860640 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerDied","Data":"3cf94886364cef91e20ae9016e04659436eabd5438c6457787eb48cc78c05d42"} Jan 21 14:33:53 crc kubenswrapper[4720]: I0121 14:33:53.907875 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7fb4w" podStartSLOduration=5.491724824 podStartE2EDuration="9.907856339s" podCreationTimestamp="2026-01-21 14:33:44 +0000 UTC" firstStartedPulling="2026-01-21 14:33:46.796412579 +0000 UTC m=+264.705152551" lastFinishedPulling="2026-01-21 14:33:51.212544134 +0000 UTC m=+269.121284066" observedRunningTime="2026-01-21 14:33:52.934838048 +0000 UTC m=+270.843577990" watchObservedRunningTime="2026-01-21 14:33:53.907856339 +0000 UTC m=+271.816596271" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.604533 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.605121 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:54 crc kubenswrapper[4720]: I0121 14:33:54.638961 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:33:55 crc kubenswrapper[4720]: I0121 14:33:55.211401 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:55 crc kubenswrapper[4720]: I0121 14:33:55.211521 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.251030 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4hxc8" podUID="86ba467d-dfbe-493b-acf6-17b938a753b0" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:56 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:56 crc kubenswrapper[4720]: > Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.880389 4720 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerStarted","Data":"e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa"} Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.882358 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqrkw" event={"ID":"f9a3c893-2903-4355-9af3-b8f981477494","Type":"ContainerStarted","Data":"803babe81199b98919d30ecc8f9d07b7ebe605b7beb05ae97585d5105fad8b7f"} Jan 21 14:33:56 crc kubenswrapper[4720]: I0121 14:33:56.942530 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kb2c7" podStartSLOduration=3.928977957 podStartE2EDuration="10.942507093s" podCreationTimestamp="2026-01-21 14:33:46 +0000 UTC" firstStartedPulling="2026-01-21 14:33:48.823671277 +0000 UTC m=+266.732411199" lastFinishedPulling="2026-01-21 14:33:55.837200403 +0000 UTC m=+273.745940335" observedRunningTime="2026-01-21 14:33:56.908044506 +0000 UTC m=+274.816784458" watchObservedRunningTime="2026-01-21 14:33:56.942507093 +0000 UTC m=+274.851247035" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.007710 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.007769 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.607923 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:57 crc kubenswrapper[4720]: I0121 14:33:57.607974 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:33:58 crc kubenswrapper[4720]: I0121 14:33:58.051122 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-kb2c7" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:58 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:58 crc kubenswrapper[4720]: > Jan 21 14:33:58 crc kubenswrapper[4720]: I0121 14:33:58.643384 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bqrkw" podUID="f9a3c893-2903-4355-9af3-b8f981477494" containerName="registry-server" probeResult="failure" output=< Jan 21 14:33:58 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:33:58 crc kubenswrapper[4720]: > Jan 21 14:34:04 crc kubenswrapper[4720]: I0121 14:34:04.648783 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7fb4w" Jan 21 14:34:04 crc kubenswrapper[4720]: I0121 14:34:04.667384 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bqrkw" podStartSLOduration=10.546034401 podStartE2EDuration="17.667364344s" podCreationTimestamp="2026-01-21 14:33:47 +0000 UTC" firstStartedPulling="2026-01-21 14:33:48.823806822 +0000 UTC m=+266.732546754" lastFinishedPulling="2026-01-21 14:33:55.945136765 +0000 UTC m=+273.853876697" observedRunningTime="2026-01-21 14:33:56.944300624 +0000 UTC m=+274.853040576" 
watchObservedRunningTime="2026-01-21 14:34:04.667364344 +0000 UTC m=+282.576104276" Jan 21 14:34:05 crc kubenswrapper[4720]: I0121 14:34:05.249458 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:34:05 crc kubenswrapper[4720]: I0121 14:34:05.295361 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4hxc8" Jan 21 14:34:07 crc kubenswrapper[4720]: I0121 14:34:07.050956 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.100405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kb2c7" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.650437 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:34:08 crc kubenswrapper[4720]: I0121 14:34:07.690958 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bqrkw" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.517401 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.517837 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" containerID="cri-o://5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" gracePeriod=30 Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.911673 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961426 4720 generic.go:334] "Generic (PLEG): container finished" podID="840dfd09-e274-4c2b-9299-a494100e266d" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" exitCode=0 Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerDied","Data":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"} Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" event={"ID":"840dfd09-e274-4c2b-9299-a494100e266d","Type":"ContainerDied","Data":"a827d68c41cf6bca1d1353db6d4c691cd0bbcd9fa7fef0db59ccff42a67e61f8"} Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961516 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.961525 4720 scope.go:117] "RemoveContainer" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.979461 4720 scope.go:117] "RemoveContainer" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: E0121 14:34:11.980814 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": container with ID starting with 5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3 not found: ID does not exist" containerID="5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.980853 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3"} err="failed to get container status \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": rpc error: code = NotFound desc = could not find container \"5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3\": container with ID starting with 5c99c09d773097bce63e27d370753212cd035db79226b6d7485a1c5439ac7fb3 not found: ID does not exist" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992370 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992420 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992519 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.992548 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") pod \"840dfd09-e274-4c2b-9299-a494100e266d\" (UID: \"840dfd09-e274-4c2b-9299-a494100e266d\") " Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.993426 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca" (OuterVolumeSpecName: "client-ca") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:11 crc kubenswrapper[4720]: I0121 14:34:11.993447 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config" (OuterVolumeSpecName: "config") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.002155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.013926 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7" (OuterVolumeSpecName: "kube-api-access-m8cm7") pod "840dfd09-e274-4c2b-9299-a494100e266d" (UID: "840dfd09-e274-4c2b-9299-a494100e266d"). InnerVolumeSpecName "kube-api-access-m8cm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025539 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"] Jan 21 14:34:12 crc kubenswrapper[4720]: E0121 14:34:12.025768 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025779 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.025870 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="840dfd09-e274-4c2b-9299-a494100e266d" containerName="route-controller-manager" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.026257 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.046188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"] Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102156 4720 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102401 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8cm7\" (UniqueName: \"kubernetes.io/projected/840dfd09-e274-4c2b-9299-a494100e266d-kube-api-access-m8cm7\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102515 4720 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/840dfd09-e274-4c2b-9299-a494100e266d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.102613 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/840dfd09-e274-4c2b-9299-a494100e266d-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.203951 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204023 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204092 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204114 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc 
kubenswrapper[4720]: I0121 14:34:12.204130 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.204180 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.222618 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.289170 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.293819 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5cf5674648-vlcxj"] Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305707 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305771 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305798 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305812 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305840 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305859 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.305880 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.306465 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bcab83dd-6fc2-4f43-b30a-831af267b19d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.307026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-trusted-ca\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.307300 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-certificates\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.311119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bcab83dd-6fc2-4f43-b30a-831af267b19d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.311159 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-registry-tls\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 
14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.322710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq7qv\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-kube-api-access-zq7qv\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.324848 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bcab83dd-6fc2-4f43-b30a-831af267b19d-bound-sa-token\") pod \"image-registry-66df7c8f76-hfldm\" (UID: \"bcab83dd-6fc2-4f43-b30a-831af267b19d\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.339762 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.514845 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfldm"] Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.697156 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840dfd09-e274-4c2b-9299-a494100e266d" path="/var/lib/kubelet/pods/840dfd09-e274-4c2b-9299-a494100e266d/volumes" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970864 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" event={"ID":"bcab83dd-6fc2-4f43-b30a-831af267b19d","Type":"ContainerStarted","Data":"244af74cca6b260cfcf7f641b2db789267c3ba98bc956691a29a8cb874b361bc"} Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970936 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" event={"ID":"bcab83dd-6fc2-4f43-b30a-831af267b19d","Type":"ContainerStarted","Data":"97a3d53efb195f99cd15100ff87f06556bb4f4d3f7a4a5ad373d106e45f6e42e"} Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.970989 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:12 crc kubenswrapper[4720]: I0121 14:34:12.994815 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" podStartSLOduration=0.994795075 podStartE2EDuration="994.795075ms" podCreationTimestamp="2026-01-21 14:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:34:12.99112101 +0000 UTC m=+290.899860962" watchObservedRunningTime="2026-01-21 14:34:12.994795075 +0000 UTC m=+290.903535007" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.211235 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"] Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.212006 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.214788 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215722 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215738 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.215807 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.216414 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.219301 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.224424 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"] Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319367 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319442 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.319554 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420369 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod 
\"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420425 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420458 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.420506 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.421416 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-client-ca\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.422023 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/724953e6-eb48-401a-b5fd-fb565448db70-config\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.426267 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/724953e6-eb48-401a-b5fd-fb565448db70-serving-cert\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.436225 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkk44\" (UniqueName: \"kubernetes.io/projected/724953e6-eb48-401a-b5fd-fb565448db70-kube-api-access-dkk44\") pod \"route-controller-manager-6cbf946f8c-b9rks\" (UID: \"724953e6-eb48-401a-b5fd-fb565448db70\") " pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.565474 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.966450 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks"] Jan 21 14:34:13 crc kubenswrapper[4720]: I0121 14:34:13.980473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" event={"ID":"724953e6-eb48-401a-b5fd-fb565448db70","Type":"ContainerStarted","Data":"bfd9635fee2a0d38ae67c9bdcfdf17fe7f7c524b1c3046f92e7ebe8fc2ae4624"} Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.003419 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" event={"ID":"724953e6-eb48-401a-b5fd-fb565448db70","Type":"ContainerStarted","Data":"c9b90dc4356cdfbe8ed5c625d916169f6fe0794c10c8acead03e0d635fe33f0a"} Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.004819 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.011607 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" Jan 21 14:34:18 crc kubenswrapper[4720]: I0121 14:34:18.021153 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6cbf946f8c-b9rks" podStartSLOduration=7.021134491 podStartE2EDuration="7.021134491s" podCreationTimestamp="2026-01-21 14:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:34:18.018726532 +0000 UTC m=+295.927466494" watchObservedRunningTime="2026-01-21 14:34:18.021134491 +0000 UTC m=+295.929874423" Jan 21 14:34:22 crc kubenswrapper[4720]: I0121 14:34:22.230103 4720 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 21 14:34:32 crc kubenswrapper[4720]: I0121 14:34:32.351304 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hfldm" Jan 21 14:34:32 crc kubenswrapper[4720]: I0121 14:34:32.427560 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:34:57 crc kubenswrapper[4720]: I0121 14:34:57.485512 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" containerID="cri-o://e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" gracePeriod=30 Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.221522 4720 generic.go:334] "Generic (PLEG): container finished" podID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerID="e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" exitCode=0 Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.221576 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" 
event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerDied","Data":"e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d"} Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.401339 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.423959 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424018 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424190 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424270 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424340 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424700 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424730 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.424787 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") pod \"ccf13312-4caa-4898-9dd3-3f9614ecee01\" (UID: \"ccf13312-4caa-4898-9dd3-3f9614ecee01\") " Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.427287 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: 
"ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.427507 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.432162 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.433048 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.440936 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.443509 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.447677 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd" (OuterVolumeSpecName: "kube-api-access-7jcfd") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "kube-api-access-7jcfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.450716 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ccf13312-4caa-4898-9dd3-3f9614ecee01" (UID: "ccf13312-4caa-4898-9dd3-3f9614ecee01"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526078 4720 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ccf13312-4caa-4898-9dd3-3f9614ecee01-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526115 4720 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ccf13312-4caa-4898-9dd3-3f9614ecee01-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526124 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jcfd\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-kube-api-access-7jcfd\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526135 4720 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526144 4720 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526180 4720 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ccf13312-4caa-4898-9dd3-3f9614ecee01-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:58 crc kubenswrapper[4720]: I0121 14:34:58.526189 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ccf13312-4caa-4898-9dd3-3f9614ecee01-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228688 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" event={"ID":"ccf13312-4caa-4898-9dd3-3f9614ecee01","Type":"ContainerDied","Data":"cc78447803378e22f6cbae3e9270bdc6d0ee1630fceb9cd43ec6c839a71ce985"} Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228778 4720 scope.go:117] "RemoveContainer" containerID="e3c917729cbed0b95bf83042f4024bb09e5fcf08063dcc9274062f7754ff9a3d" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.228770 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6kjwf" Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.247811 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:34:59 crc kubenswrapper[4720]: I0121 14:34:59.254476 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6kjwf"] Jan 21 14:35:00 crc kubenswrapper[4720]: I0121 14:35:00.685155 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" path="/var/lib/kubelet/pods/ccf13312-4caa-4898-9dd3-3f9614ecee01/volumes" Jan 21 14:35:22 crc kubenswrapper[4720]: I0121 14:35:22.880226 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:35:22 crc kubenswrapper[4720]: I0121 14:35:22.880916 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:35:52 crc kubenswrapper[4720]: I0121 14:35:52.880575 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:35:52 crc kubenswrapper[4720]: I0121 14:35:52.881172 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.880219 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.880942 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881756 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Jan 21 14:36:22 crc kubenswrapper[4720]: I0121 14:36:22.881842 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" gracePeriod=600 Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682065 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" exitCode=0 Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682106 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1"} Jan 21 14:36:23 crc kubenswrapper[4720]: I0121 14:36:23.682136 4720 scope.go:117] "RemoveContainer" containerID="926a9b75c9fc74a93dd69c62eb765f3cdb4aeaf1bc918f7c3dc8f79011404240" Jan 21 14:36:24 crc kubenswrapper[4720]: I0121 14:36:24.688027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} Jan 21 14:36:56 crc kubenswrapper[4720]: I0121 14:36:56.808922 4720 scope.go:117] "RemoveContainer" containerID="14e886daf1a3a6b869ffcf74d313a6df0c2abaf901b1048767f8b1caf48b8b35" Jan 21 14:37:56 crc kubenswrapper[4720]: I0121 14:37:56.841385 4720 scope.go:117] "RemoveContainer" containerID="828c55378e558356171a9771b0f3cab050cb198f63a03e622439dc4e677f234d" Jan 21 14:37:56 crc kubenswrapper[4720]: I0121 14:37:56.871676 4720 scope.go:117] "RemoveContainer" containerID="a7b59502380a4895dd54770bd6aaf04fdda2243417fd15217dfcddfff65770ae" Jan 21 14:38:52 crc kubenswrapper[4720]: I0121 14:38:52.880601 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:38:52 crc kubenswrapper[4720]: I0121 14:38:52.881365 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:39:22 crc kubenswrapper[4720]: I0121 14:39:22.880251 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:39:22 crc kubenswrapper[4720]: I0121 14:39:22.880899 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.122553 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: E0121 14:39:34.123444 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.123462 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.123590 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf13312-4caa-4898-9dd3-3f9614ecee01" containerName="registry" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.124295 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.126731 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.126938 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.127179 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-k9tzj" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.137071 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.150756 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.151528 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.154475 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hvqqj" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.179511 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.184104 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.184876 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.187891 4720 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rr7l9" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.199682 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.240979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341715 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341767 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.341822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.362869 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b69np\" (UniqueName: \"kubernetes.io/projected/4eec0898-8a1a-47d9-ac37-62cfe6c7b857-kube-api-access-b69np\") pod \"cert-manager-858654f9db-d6jp2\" (UID: \"4eec0898-8a1a-47d9-ac37-62cfe6c7b857\") " pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.363710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kp6f6\" (UniqueName: \"kubernetes.io/projected/0236eaa4-e5d8-4699-82f8-1e9648f95dc8-kube-api-access-kp6f6\") pod \"cert-manager-webhook-687f57d79b-vflwv\" (UID: \"0236eaa4-e5d8-4699-82f8-1e9648f95dc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.365607 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p9sc\" (UniqueName: \"kubernetes.io/projected/4939bfdd-b3b4-4850-8b5d-3399548ad5a0-kube-api-access-9p9sc\") pod \"cert-manager-cainjector-cf98fcc89-c4tn5\" (UID: \"4939bfdd-b3b4-4850-8b5d-3399548ad5a0\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.440920 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.466109 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-d6jp2" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.500227 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.784991 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.802160 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.836246 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-d6jp2"] Jan 21 14:39:34 crc kubenswrapper[4720]: I0121 14:39:34.864945 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-vflwv"] Jan 21 14:39:34 crc kubenswrapper[4720]: W0121 14:39:34.866289 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0236eaa4_e5d8_4699_82f8_1e9648f95dc8.slice/crio-222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d WatchSource:0}: Error finding container 222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d: Status 404 returned error can't find the container with id 222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.784048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d6jp2" event={"ID":"4eec0898-8a1a-47d9-ac37-62cfe6c7b857","Type":"ContainerStarted","Data":"91480a298275598c96f67adb1602c43e4fccf021c122b5c3fdaaf9be02d132cf"} Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.787093 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" event={"ID":"0236eaa4-e5d8-4699-82f8-1e9648f95dc8","Type":"ContainerStarted","Data":"222b99aa1177b46333389096afaf13753b0a749d17617b49b55146ced6373a4d"} Jan 21 14:39:35 crc kubenswrapper[4720]: I0121 14:39:35.789562 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" event={"ID":"4939bfdd-b3b4-4850-8b5d-3399548ad5a0","Type":"ContainerStarted","Data":"b430360181832cde5ba7b8ff85d38fc1cde96fbe9868850fb1d3a474e26a3a3c"} Jan 21 14:39:37 crc kubenswrapper[4720]: I0121 14:39:37.801259 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" event={"ID":"4939bfdd-b3b4-4850-8b5d-3399548ad5a0","Type":"ContainerStarted","Data":"600a6ace4d38f05f38eb69d88c28bdcc5d2daea310b75beec57f95c5e3e43dc0"} Jan 21 14:39:37 crc kubenswrapper[4720]: I0121 14:39:37.826227 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-c4tn5" podStartSLOduration=1.582203775 podStartE2EDuration="3.82620782s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.80189582 +0000 UTC m=+612.710635752" lastFinishedPulling="2026-01-21 14:39:37.045899865 +0000 UTC m=+614.954639797" observedRunningTime="2026-01-21 14:39:37.820087489 +0000 UTC m=+615.728827431" watchObservedRunningTime="2026-01-21 14:39:37.82620782 +0000 UTC m=+615.734947772" Jan 21 14:39:38 crc kubenswrapper[4720]: I0121 14:39:38.807305 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" event={"ID":"0236eaa4-e5d8-4699-82f8-1e9648f95dc8","Type":"ContainerStarted","Data":"f53b18ac17718378a8c351acf908be0f328901f8b9cb647748741bd3372d412a"} Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.813550 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-d6jp2" event={"ID":"4eec0898-8a1a-47d9-ac37-62cfe6c7b857","Type":"ContainerStarted","Data":"245195f570219ab9a529c7c627643788975777c677bc8e0e705a3f27df79e779"} Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.814226 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.833542 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" podStartSLOduration=2.225109085 podStartE2EDuration="5.833522222s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.868917494 +0000 UTC m=+612.777657426" lastFinishedPulling="2026-01-21 14:39:38.477330631 +0000 UTC m=+616.386070563" observedRunningTime="2026-01-21 14:39:39.830124508 +0000 UTC m=+617.738864450" watchObservedRunningTime="2026-01-21 14:39:39.833522222 +0000 UTC m=+617.742262154" Jan 21 14:39:39 crc kubenswrapper[4720]: I0121 14:39:39.846401 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-d6jp2" podStartSLOduration=2.154915713 podStartE2EDuration="5.846386241s" podCreationTimestamp="2026-01-21 14:39:34 +0000 UTC" firstStartedPulling="2026-01-21 14:39:34.844644848 +0000 UTC m=+612.753384770" lastFinishedPulling="2026-01-21 14:39:38.536115366 +0000 UTC m=+616.444855298" observedRunningTime="2026-01-21 14:39:39.845055524 +0000 UTC m=+617.753795456" watchObservedRunningTime="2026-01-21 14:39:39.846386241 +0000 UTC m=+617.755126173" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.474647 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478100 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" containerID="cri-o://625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" gracePeriod=30 Jan 21 14:39:43 crc 
kubenswrapper[4720]: I0121 14:39:43.478306 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" containerID="cri-o://d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478492 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478697 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" containerID="cri-o://4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.478865 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" containerID="cri-o://259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.477574 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" containerID="cri-o://aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.479245 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" containerID="cri-o://cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.509414 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" containerID="cri-o://b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" gracePeriod=30 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834132 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834177 4720 generic.go:334] "Generic (PLEG): container finished" podID="a40805c6-ef8a-4ae0-bb5b-1834d257e8c6" containerID="3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61" exitCode=2 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834224 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerDied","Data":"3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.834638 4720 scope.go:117] "RemoveContainer" containerID="3df2e65ca3b78094d1f1a647b130e272d7eff6699626e3dace56d3c8488f9d61" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 
14:39:43.842204 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.842715 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log" Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843186 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843219 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843229 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843238 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" exitCode=0 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843244 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" exitCode=143 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843250 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" exitCode=143 Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843270 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843293 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843331 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} Jan 21 14:39:43 crc kubenswrapper[4720]: I0121 14:39:43.843340 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} Jan 21 14:39:43 
crc kubenswrapper[4720]: I0121 14:39:43.843348 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.143601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.144408 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.144830 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199443 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2jn6"] Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199622 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199634 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199644 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199650 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199677 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199682 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199691 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199696 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199707 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199712 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199721 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199726 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199735 4720 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kubecfg-setup" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199741 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kubecfg-setup" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199749 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199755 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: E0121 14:39:44.199765 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199771 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199865 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-acl-logging" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199878 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-node" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199888 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="nbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199897 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="kube-rbac-proxy-ovn-metrics" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199906 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovnkube-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199914 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="sbdb" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199922 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="ovn-controller" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.199930 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerName="northd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.201702 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220086 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220104 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220150 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220169 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220189 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220201 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220253 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220266 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220281 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220333 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220357 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220379 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220407 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220424 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220442 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220462 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") pod \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\" (UID: \"ac61c15b-6fe9-4c83-9ca7-588095ab1a29\") " Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220530 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220547 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220555 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220584 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220601 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log" (OuterVolumeSpecName: "node-log") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220608 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220626 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash" (OuterVolumeSpecName: "host-slash") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220647 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220689 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220887 4720 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220903 4720 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220914 4720 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-node-log\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220925 4720 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220936 4720 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220946 4720 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220959 4720 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220971 4720 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-slash\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220983 4720 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220943 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220962 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221005 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.220986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221056 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221112 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.221046 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket" (OuterVolumeSpecName: "log-socket") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.237775 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.238508 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r" (OuterVolumeSpecName: "kube-api-access-kvf2r") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "kube-api-access-kvf2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.239867 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ac61c15b-6fe9-4c83-9ca7-588095ab1a29" (UID: "ac61c15b-6fe9-4c83-9ca7-588095ab1a29"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322464 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322530 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322690 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322707 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322736 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322758 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322775 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.322853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323015 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323044 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323067 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323095 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323129 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323158 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323182 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323215 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323312 4720 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323328 4720 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323341 4720 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-log-socket\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323353 4720 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 
21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323364 4720 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323375 4720 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323389 4720 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323402 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323413 4720 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323424 4720 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.323436 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvf2r\" (UniqueName: \"kubernetes.io/projected/ac61c15b-6fe9-4c83-9ca7-588095ab1a29-kube-api-access-kvf2r\") on node \"crc\" DevicePath \"\"" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424330 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424388 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424405 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 
14:39:44.424436 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424470 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424501 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424515 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424528 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424562 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424578 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424592 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424624 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424642 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424686 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424749 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-bin\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.424783 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-netns\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-script-lib\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425426 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-systemd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425507 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovnkube-config\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425558 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-systemd-units\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425595 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425631 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425683 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-ovn\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-node-log\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425744 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-run-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-log-socket\") pod \"ovnkube-node-w2jn6\" (UID: 
\"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425812 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-etc-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425834 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-slash\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-var-lib-openvswitch\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425860 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-cni-netd\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.425938 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60550f21-b0dd-410b-a4be-cba72e8b7b71-host-kubelet\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.426089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60550f21-b0dd-410b-a4be-cba72e8b7b71-env-overrides\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.429357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60550f21-b0dd-410b-a4be-cba72e8b7b71-ovn-node-metrics-cert\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.447100 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jxb8\" (UniqueName: \"kubernetes.io/projected/60550f21-b0dd-410b-a4be-cba72e8b7b71-kube-api-access-6jxb8\") pod \"ovnkube-node-w2jn6\" (UID: \"60550f21-b0dd-410b-a4be-cba72e8b7b71\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.502768 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-vflwv" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.512227 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.849024 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.849084 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85dm" event={"ID":"a40805c6-ef8a-4ae0-bb5b-1834d257e8c6","Type":"ContainerStarted","Data":"acc666dd42119bae6cd2f607818a28cef7871b82ccc16b234457bb8f06955709"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.853357 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-acl-logging/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.853888 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zr5bd_ac61c15b-6fe9-4c83-9ca7-588095ab1a29/ovn-controller/0.log" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854209 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" exitCode=0 Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854229 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" exitCode=0 Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854324 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854840 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854891 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zr5bd" event={"ID":"ac61c15b-6fe9-4c83-9ca7-588095ab1a29","Type":"ContainerDied","Data":"ca1757282192974108b64124881bc36690fc3400e42954719815f361ddc7c63e"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.854909 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856414 4720 generic.go:334] "Generic (PLEG): container finished" podID="60550f21-b0dd-410b-a4be-cba72e8b7b71" containerID="ffa7c67075bb066d8c8ca6301e43653894e8f39c98a52992e9346cf1d8c920e6" exitCode=0 Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856433 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerDied","Data":"ffa7c67075bb066d8c8ca6301e43653894e8f39c98a52992e9346cf1d8c920e6"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.856447 4720 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"53cdfe98eeecad403bbed2b730519ce0b8a2a7a987916f6952257d761eb1338b"} Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.880858 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.911809 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.911918 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.918681 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zr5bd"] Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.927238 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.945117 4720 scope.go:117] "RemoveContainer" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.963060 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.975144 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.985882 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" Jan 21 14:39:44 crc kubenswrapper[4720]: I0121 14:39:44.999025 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.011898 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.012608 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012647 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} err="failed to get container status \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012694 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.012923 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012959 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} err="failed to get container status \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": rpc error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.012974 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013164 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013185 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} err="failed to get container status \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013208 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013393 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013415 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} err="failed to get container status \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013428 4720 scope.go:117] "RemoveContainer" 
containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.013608 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013629 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} err="failed to get container status \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.013668 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014232 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014252 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} err="failed to get container status \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014265 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014527 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014547 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} err="failed to get container status \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 
259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014563 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.014855 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014877 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} err="failed to get container status \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.014891 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" Jan 21 14:39:45 crc kubenswrapper[4720]: E0121 14:39:45.015149 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.015175 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} err="failed to get container status \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.015192 4720 scope.go:117] "RemoveContainer" containerID="b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.016869 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39"} err="failed to get container status \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": rpc error: code = NotFound desc = could not find container \"b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39\": container with ID starting with b87e56ea354fc6a8d23bff77c9d817ab2f0481a52e7536b33cff8e842d3acc39 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.016898 4720 scope.go:117] "RemoveContainer" containerID="cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff" Jan 21 14:39:45 crc 
kubenswrapper[4720]: I0121 14:39:45.017196 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff"} err="failed to get container status \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": rpc error: code = NotFound desc = could not find container \"cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff\": container with ID starting with cc403e7370f4c0e180578f7c937d1c540b58b911d0dcde263edb0332846542ff not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017221 4720 scope.go:117] "RemoveContainer" containerID="625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017450 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc"} err="failed to get container status \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": rpc error: code = NotFound desc = could not find container \"625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc\": container with ID starting with 625271ad4ff015cec75e604df3cd6c896dadaa55c6eea3a035e02ffaace285cc not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017470 4720 scope.go:117] "RemoveContainer" containerID="d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0"} err="failed to get container status \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": rpc error: code = NotFound desc = could not find container \"d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0\": container with ID starting with d1e39c6489267605567083bbf58d52dc83783ab46cf6468a4a619fdd22a893c0 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017714 4720 scope.go:117] "RemoveContainer" containerID="3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017941 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a"} err="failed to get container status \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": rpc error: code = NotFound desc = could not find container \"3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a\": container with ID starting with 3ec88cc895f2b82eea19c4a927605f7196f008760619664f069a2d889cbe5c5a not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.017963 4720 scope.go:117] "RemoveContainer" containerID="4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018229 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136"} err="failed to get container status \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": rpc error: code = NotFound desc = could not find container \"4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136\": container with ID 
starting with 4056ab2c599299e02ae3a697f4141c19f2ef8508967564b6eb65b03665cc2136 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018247 4720 scope.go:117] "RemoveContainer" containerID="259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018413 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9"} err="failed to get container status \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": rpc error: code = NotFound desc = could not find container \"259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9\": container with ID starting with 259075e5d56293bfdd3c160453e38651da3e67e117efc2e3c9012ac4918c0bf9 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.018440 4720 scope.go:117] "RemoveContainer" containerID="aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019154 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556"} err="failed to get container status \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": rpc error: code = NotFound desc = could not find container \"aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556\": container with ID starting with aae9671a8f7e4c811bae63647d082185a7a92f1832f12a8387956c7bb6aab556 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019239 4720 scope.go:117] "RemoveContainer" containerID="3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.019976 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2"} err="failed to get container status \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": rpc error: code = NotFound desc = could not find container \"3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2\": container with ID starting with 3644c0d176037d9e6f92e84047317c19d376ed8369e5503be68c782d1277bff2 not found: ID does not exist" Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.881197 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"20f752df985e275fc170d7f09f0fda0fa79b977ca7e9c54386b56bf70a664352"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882703 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"e6df46ce549cf47027353bd50dfa269d320f9b65242e840b9bca91af4bc8bb02"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"d53f25e750b9c7df4a0a690ac6bb64af4d058ad69480c307e32df5881f6d78c9"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882864 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" 
event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"593c6ed2601f8bcc6a2c5dff92288fba00edade13ad8613b8646f6236049e490"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.882934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"1181bddc1a3f560e3a5d183aaf6b578f0be8d4305fa36e63c13e101b915b6d86"} Jan 21 14:39:45 crc kubenswrapper[4720]: I0121 14:39:45.883138 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"cbbfa58798f5909f5d588eedb36f15bc96de2bd901929cc01f7e493297c79976"} Jan 21 14:39:46 crc kubenswrapper[4720]: I0121 14:39:46.684831 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac61c15b-6fe9-4c83-9ca7-588095ab1a29" path="/var/lib/kubelet/pods/ac61c15b-6fe9-4c83-9ca7-588095ab1a29/volumes" Jan 21 14:39:48 crc kubenswrapper[4720]: I0121 14:39:48.904149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"fbf18024f843adc9e3c3ccb274c6ee33723a30f7251259ff499f4e13cbd98576"} Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.879729 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880254 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880311 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.880980 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.881031 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" gracePeriod=600 Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" event={"ID":"60550f21-b0dd-410b-a4be-cba72e8b7b71","Type":"ContainerStarted","Data":"d1e3c4978e0d34c293a9b1d777692003c1bbe558e4cd835ce33d3759a90bad05"} Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948472 4720 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.948572 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:52 crc kubenswrapper[4720]: I0121 14:39:52.980396 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.022945 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" podStartSLOduration=9.022928146 podStartE2EDuration="9.022928146s" podCreationTimestamp="2026-01-21 14:39:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:39:52.978240203 +0000 UTC m=+630.886980145" watchObservedRunningTime="2026-01-21 14:39:53.022928146 +0000 UTC m=+630.931668078" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.955972 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" exitCode=0 Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957245 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c"} Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957281 4720 scope.go:117] "RemoveContainer" containerID="eab7230c9b1780824322550642987ab8759942bce4be148af7dcc4a247edffb1" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.957385 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:53 crc kubenswrapper[4720]: I0121 14:39:53.989133 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:39:54 crc kubenswrapper[4720]: I0121 14:39:54.965610 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} Jan 21 14:40:14 crc kubenswrapper[4720]: I0121 14:40:14.538225 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2jn6" Jan 21 14:40:16 crc kubenswrapper[4720]: I0121 14:40:16.323206 4720 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.711380 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.713141 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.715066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.729254 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.829344 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.829728 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.830016 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.931920 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932605 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.932896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:29 crc kubenswrapper[4720]: I0121 14:40:29.963880 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.029528 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.272062 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw"] Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.965923 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="18b85be22696d7a1e91179c4db1f586e24b4d0f58de335c96bb3ab80b6d2a3b1" exitCode=0 Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.966005 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"18b85be22696d7a1e91179c4db1f586e24b4d0f58de335c96bb3ab80b6d2a3b1"} Jan 21 14:40:30 crc kubenswrapper[4720]: I0121 14:40:30.966241 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerStarted","Data":"7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.039443 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.042453 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.056105 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.159992 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.160073 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.160212 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261302 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261601 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.261777 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.262090 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.282258 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"redhat-operators-ppkd7\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.373161 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.781884 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978445 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" exitCode=0 Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978508 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.978856 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"b483512bc397003ba094d82107302d250056bd93ed3604599c9f018730b4610a"} Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.981761 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="a02146b329cdb9125ea485ddeff1b2ec486f6e3ba61d778148451d89eb67ef1f" exitCode=0 Jan 21 14:40:32 crc kubenswrapper[4720]: I0121 14:40:32.981792 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"a02146b329cdb9125ea485ddeff1b2ec486f6e3ba61d778148451d89eb67ef1f"} Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.994035 4720 generic.go:334] "Generic (PLEG): container finished" podID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerID="b966931718143442e396a12e6ac157054dc8452123b9bf8cc73c2cf135f05ad2" exitCode=0 Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.994098 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"b966931718143442e396a12e6ac157054dc8452123b9bf8cc73c2cf135f05ad2"} Jan 21 14:40:33 crc kubenswrapper[4720]: I0121 14:40:33.996775 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.006288 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" exitCode=0 Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.007013 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" 
event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.295118 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.401903 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402085 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") pod \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\" (UID: \"d714bdab-c0dc-4710-bae5-ec08841d2c0d\") " Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.402642 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle" (OuterVolumeSpecName: "bundle") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.407716 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h" (OuterVolumeSpecName: "kube-api-access-5w99h") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "kube-api-access-5w99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.421273 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util" (OuterVolumeSpecName: "util") pod "d714bdab-c0dc-4710-bae5-ec08841d2c0d" (UID: "d714bdab-c0dc-4710-bae5-ec08841d2c0d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503668 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503706 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w99h\" (UniqueName: \"kubernetes.io/projected/d714bdab-c0dc-4710-bae5-ec08841d2c0d-kube-api-access-5w99h\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:35 crc kubenswrapper[4720]: I0121 14:40:35.503722 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d714bdab-c0dc-4710-bae5-ec08841d2c0d-util\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013118 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" event={"ID":"d714bdab-c0dc-4710-bae5-ec08841d2c0d","Type":"ContainerDied","Data":"7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e"} Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013171 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dc037c255fe5d539fe598aa5fa5e4707047078369eaf44896cfb3a6c5f1899e" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.013131 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw" Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.014772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerStarted","Data":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} Jan 21 14:40:36 crc kubenswrapper[4720]: I0121 14:40:36.037221 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppkd7" podStartSLOduration=1.488256801 podStartE2EDuration="4.037200697s" podCreationTimestamp="2026-01-21 14:40:32 +0000 UTC" firstStartedPulling="2026-01-21 14:40:32.979785067 +0000 UTC m=+670.888524999" lastFinishedPulling="2026-01-21 14:40:35.528728953 +0000 UTC m=+673.437468895" observedRunningTime="2026-01-21 14:40:36.035970353 +0000 UTC m=+673.944710315" watchObservedRunningTime="2026-01-21 14:40:36.037200697 +0000 UTC m=+673.945940639" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948425 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:39 crc kubenswrapper[4720]: E0121 14:40:39.948698 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="pull" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948714 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="pull" Jan 21 14:40:39 crc kubenswrapper[4720]: E0121 14:40:39.948734 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="util" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948742 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="util" Jan 21 14:40:39 crc 
kubenswrapper[4720]: E0121 14:40:39.948753 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948762 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.948883 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d714bdab-c0dc-4710-bae5-ec08841d2c0d" containerName="extract" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.949345 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.958150 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.960251 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.964167 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-bsqcl" Jan 21 14:40:39 crc kubenswrapper[4720]: I0121 14:40:39.965580 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.062157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.163736 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.184505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrf2\" (UniqueName: \"kubernetes.io/projected/2bdd7be0-b9cf-4501-9816-87831d74becc-kube-api-access-qqrf2\") pod \"nmstate-operator-646758c888-mclmr\" (UID: \"2bdd7be0-b9cf-4501-9816-87831d74becc\") " pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.300452 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" Jan 21 14:40:40 crc kubenswrapper[4720]: I0121 14:40:40.694005 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-mclmr"] Jan 21 14:40:40 crc kubenswrapper[4720]: W0121 14:40:40.702633 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bdd7be0_b9cf_4501_9816_87831d74becc.slice/crio-f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e WatchSource:0}: Error finding container f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e: Status 404 returned error can't find the container with id f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e Jan 21 14:40:41 crc kubenswrapper[4720]: I0121 14:40:41.041395 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" event={"ID":"2bdd7be0-b9cf-4501-9816-87831d74becc","Type":"ContainerStarted","Data":"f3fc166b7896a8194e4111a838e6cb0e94e68a03aca1ce44a67721ee703bf85e"} Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.373537 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.373945 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:42 crc kubenswrapper[4720]: I0121 14:40:42.433043 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:43 crc kubenswrapper[4720]: I0121 14:40:43.118065 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:44 crc kubenswrapper[4720]: I0121 14:40:44.825165 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:45 crc kubenswrapper[4720]: I0121 14:40:45.066579 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" event={"ID":"2bdd7be0-b9cf-4501-9816-87831d74becc","Type":"ContainerStarted","Data":"4a394ea7c9dd01b0b3fdaa7b8a60225bc738f14651f025e0d795b97cfa1cda8e"} Jan 21 14:40:45 crc kubenswrapper[4720]: I0121 14:40:45.091813 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-mclmr" podStartSLOduration=2.588743798 podStartE2EDuration="6.091782446s" podCreationTimestamp="2026-01-21 14:40:39 +0000 UTC" firstStartedPulling="2026-01-21 14:40:40.705236204 +0000 UTC m=+678.613976126" lastFinishedPulling="2026-01-21 14:40:44.208274842 +0000 UTC m=+682.117014774" observedRunningTime="2026-01-21 14:40:45.082354407 +0000 UTC m=+682.991094379" watchObservedRunningTime="2026-01-21 14:40:45.091782446 +0000 UTC m=+683.000522438" Jan 21 14:40:46 crc kubenswrapper[4720]: I0121 14:40:46.072827 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppkd7" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" containerID="cri-o://5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" gracePeriod=2 Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.606856 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658313 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658408 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.658460 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") pod \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\" (UID: \"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d\") " Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.665259 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27" (OuterVolumeSpecName: "kube-api-access-k8v27") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "kube-api-access-k8v27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.674877 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities" (OuterVolumeSpecName: "utilities") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.761396 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8v27\" (UniqueName: \"kubernetes.io/projected/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-kube-api-access-k8v27\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.761436 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.773410 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" (UID: "55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:40:47 crc kubenswrapper[4720]: I0121 14:40:47.862747 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091634 4720 generic.go:334] "Generic (PLEG): container finished" podID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" exitCode=0 Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091737 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppkd7" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.091744 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.092079 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppkd7" event={"ID":"55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d","Type":"ContainerDied","Data":"b483512bc397003ba094d82107302d250056bd93ed3604599c9f018730b4610a"} Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.092153 4720 scope.go:117] "RemoveContainer" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.120318 4720 scope.go:117] "RemoveContainer" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.132405 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.145336 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppkd7"] Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.155069 4720 scope.go:117] "RemoveContainer" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.175140 4720 scope.go:117] "RemoveContainer" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.175680 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": container with ID starting with 5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee not found: ID does not exist" containerID="5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.175720 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee"} err="failed to get container status \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": rpc error: code = NotFound desc = could not find container \"5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee\": container with ID starting with 5b717aa8166bcced6eba974fa17f7a03f7c406a7aa0f052008d77dd36594e8ee not found: ID does not exist" Jan 21 14:40:48 crc 
kubenswrapper[4720]: I0121 14:40:48.175745 4720 scope.go:117] "RemoveContainer" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.176078 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": container with ID starting with b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da not found: ID does not exist" containerID="b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176109 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da"} err="failed to get container status \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": rpc error: code = NotFound desc = could not find container \"b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da\": container with ID starting with b37b2645d757618a460ec0ba7037e8f644faae046e2a6225453a408f0c8bb1da not found: ID does not exist" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176127 4720 scope.go:117] "RemoveContainer" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: E0121 14:40:48.176526 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": container with ID starting with e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d not found: ID does not exist" containerID="e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.176554 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d"} err="failed to get container status \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": rpc error: code = NotFound desc = could not find container \"e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d\": container with ID starting with e95f0524a8ae71eeafa020f68f9429ac2654410c54f283a742c84a9ce386676d not found: ID does not exist" Jan 21 14:40:48 crc kubenswrapper[4720]: I0121 14:40:48.692180 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" path="/var/lib/kubelet/pods/55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d/volumes" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.032901 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033200 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-content" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033236 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-content" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033263 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033273 4720 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.033286 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-utilities" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033294 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="extract-utilities" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.033432 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a4bfc3-65ad-4f72-b2e9-5b7fe3ca9f3d" containerName="registry-server" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.034238 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.038995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-msgbw" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.043096 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.043861 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.070464 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.078622 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-l74mh"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.079357 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093445 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.093506 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.099803 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.108071 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195348 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195391 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195449 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bbp\" (UniqueName: 
\"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195496 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.195544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.195674 4720 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.195724 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair podName:c338dc84-0c3a-44c4-8f08-82001f532c2b nodeName:}" failed. No retries permitted until 2026-01-21 14:40:50.695703574 +0000 UTC m=+688.604443506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-xcckr" (UID: "c338dc84-0c3a-44c4-8f08-82001f532c2b") : secret "openshift-nmstate-webhook" not found Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.223868 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj5n5\" (UniqueName: \"kubernetes.io/projected/a26c9332-5a74-49a3-8347-45ae67cb1c90-kube-api-access-nj5n5\") pod \"nmstate-metrics-54757c584b-j9dxt\" (UID: \"a26c9332-5a74-49a3-8347-45ae67cb1c90\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.227594 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wb6w\" (UniqueName: \"kubernetes.io/projected/c338dc84-0c3a-44c4-8f08-82001f532c2b-kube-api-access-7wb6w\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.228064 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.228908 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.231242 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.234807 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.235020 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-shtmv" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.283756 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296451 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bbp\" (UniqueName: \"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296520 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296576 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296598 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296614 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-nmstate-lock\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296635 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: 
\"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296751 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296879 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-ovs-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.296912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/da16493b-aa03-4556-b3ce-d87ccfdbba70-dbus-socket\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.316372 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7bbp\" (UniqueName: \"kubernetes.io/projected/da16493b-aa03-4556-b3ce-d87ccfdbba70-kube-api-access-b7bbp\") pod \"nmstate-handler-l74mh\" (UID: \"da16493b-aa03-4556-b3ce-d87ccfdbba70\") " pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.385062 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399462 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399501 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.399527 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.400318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.400399 4720 
secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Jan 21 14:40:50 crc kubenswrapper[4720]: E0121 14:40:50.400438 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert podName:e3d11ff0-1741-4f0d-aa50-6e0144e843a6 nodeName:}" failed. No retries permitted until 2026-01-21 14:40:50.900425231 +0000 UTC m=+688.809165163 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert") pod "nmstate-console-plugin-7754f76f8b-f9sxz" (UID: "e3d11ff0-1741-4f0d-aa50-6e0144e843a6") : secret "plugin-serving-cert" not found Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.411346 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.434706 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m88pd\" (UniqueName: \"kubernetes.io/projected/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-kube-api-access-m88pd\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.439941 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85d688dff7-p76qd"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.440540 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.457483 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d688dff7-p76qd"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500529 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500582 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500712 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " 
pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500769 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500866 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.500955 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602185 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602551 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602688 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.602706 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.603997 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" 
(UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.604026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.604631 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-trusted-ca-bundle\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.605323 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-service-ca\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.605453 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.607938 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c3e9bed0-25b4-4616-a0f2-44bd9950735a-oauth-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.608119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-serving-cert\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.610197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c3e9bed0-25b4-4616-a0f2-44bd9950735a-console-oauth-config\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.620476 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c44d\" (UniqueName: \"kubernetes.io/projected/c3e9bed0-25b4-4616-a0f2-44bd9950735a-kube-api-access-6c44d\") pod \"console-85d688dff7-p76qd\" (UID: \"c3e9bed0-25b4-4616-a0f2-44bd9950735a\") " pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.705480 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " 
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.708349 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c338dc84-0c3a-44c4-8f08-82001f532c2b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-xcckr\" (UID: \"c338dc84-0c3a-44c4-8f08-82001f532c2b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.767251 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.847123 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-j9dxt"] Jan 21 14:40:50 crc kubenswrapper[4720]: W0121 14:40:50.852528 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda26c9332_5a74_49a3_8347_45ae67cb1c90.slice/crio-a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f WatchSource:0}: Error finding container a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f: Status 404 returned error can't find the container with id a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.910003 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.914696 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3d11ff0-1741-4f0d-aa50-6e0144e843a6-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-f9sxz\" (UID: \"e3d11ff0-1741-4f0d-aa50-6e0144e843a6\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.999397 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85d688dff7-p76qd"] Jan 21 14:40:50 crc kubenswrapper[4720]: I0121 14:40:50.999644 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.131640 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l74mh" event={"ID":"da16493b-aa03-4556-b3ce-d87ccfdbba70","Type":"ContainerStarted","Data":"3608421bc3bbf14b1488c1288b54ed71445a35fb46df1d0473279fc089317f50"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.135917 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d688dff7-p76qd" event={"ID":"c3e9bed0-25b4-4616-a0f2-44bd9950735a","Type":"ContainerStarted","Data":"30f95b7fd6a59d9c58be086c6ff7c2bdb9861aeb840e7c98162b7d2846eee49f"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.142504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"a8942c9fa1f3f0ea00278be371f291668f10bca3120dfd7e12b6acb07611a68f"} Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.175093 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"] Jan 21 14:40:51 crc kubenswrapper[4720]: W0121 14:40:51.179418 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc338dc84_0c3a_44c4_8f08_82001f532c2b.slice/crio-e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb WatchSource:0}: Error finding container e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb: Status 404 returned error can't find the container with id e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.187561 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" Jan 21 14:40:51 crc kubenswrapper[4720]: I0121 14:40:51.347482 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz"] Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.148682 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" event={"ID":"c338dc84-0c3a-44c4-8f08-82001f532c2b","Type":"ContainerStarted","Data":"e000b77831ffb4c7c8c77500e18dc7fe469e7fef951d3e3495de89f2e7b22cdb"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.150234 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" event={"ID":"e3d11ff0-1741-4f0d-aa50-6e0144e843a6","Type":"ContainerStarted","Data":"39a30cf9fcd6ba2b91e5e4b720fa7877f84ef0dda663c4815eb617330ea1de1b"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.152558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85d688dff7-p76qd" event={"ID":"c3e9bed0-25b4-4616-a0f2-44bd9950735a","Type":"ContainerStarted","Data":"6e6863e309843893eab584e997d0e46fc9f1e40de1882f6dcc8dcbf411b19942"} Jan 21 14:40:52 crc kubenswrapper[4720]: I0121 14:40:52.178302 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85d688dff7-p76qd" podStartSLOduration=2.178281668 podStartE2EDuration="2.178281668s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:40:52.170862334 +0000 UTC m=+690.079602276" watchObservedRunningTime="2026-01-21 14:40:52.178281668 +0000 UTC m=+690.087021600" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.170585 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"609fa9440b839fda98e31da0aaea3d146ccc38efcd5f5dce5a7909cab95a01ed"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.173400 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" event={"ID":"e3d11ff0-1741-4f0d-aa50-6e0144e843a6","Type":"ContainerStarted","Data":"1b55122c65ef537e2f7d4eb8f11ce825b69eb82673e877e3e769fd2951c26ea8"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.175595 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-l74mh" event={"ID":"da16493b-aa03-4556-b3ce-d87ccfdbba70","Type":"ContainerStarted","Data":"51e7f37323fdf4b7ed7edfbdf268006aebd143b1f8de67b38812f9745a65b891"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.175838 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.178181 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" event={"ID":"c338dc84-0c3a-44c4-8f08-82001f532c2b","Type":"ContainerStarted","Data":"b7abfc80b708d9aabad82f3e03ab3091cd274b923e9c304125a7639791b500ef"} Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.178362 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 
14:40:55.192938 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-f9sxz" podStartSLOduration=2.454574379 podStartE2EDuration="5.192917183s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:51.35405548 +0000 UTC m=+689.262795412" lastFinishedPulling="2026-01-21 14:40:54.092398274 +0000 UTC m=+692.001138216" observedRunningTime="2026-01-21 14:40:55.189768867 +0000 UTC m=+693.098508839" watchObservedRunningTime="2026-01-21 14:40:55.192917183 +0000 UTC m=+693.101657135" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.248543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr" podStartSLOduration=2.330175158 podStartE2EDuration="5.24852013s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:51.181061894 +0000 UTC m=+689.089801826" lastFinishedPulling="2026-01-21 14:40:54.099406856 +0000 UTC m=+692.008146798" observedRunningTime="2026-01-21 14:40:55.244444158 +0000 UTC m=+693.153184090" watchObservedRunningTime="2026-01-21 14:40:55.24852013 +0000 UTC m=+693.157260072" Jan 21 14:40:55 crc kubenswrapper[4720]: I0121 14:40:55.252752 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-l74mh" podStartSLOduration=1.622857306 podStartE2EDuration="5.252741355s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:50.470805652 +0000 UTC m=+688.379545584" lastFinishedPulling="2026-01-21 14:40:54.100689681 +0000 UTC m=+692.009429633" observedRunningTime="2026-01-21 14:40:55.216299535 +0000 UTC m=+693.125039497" watchObservedRunningTime="2026-01-21 14:40:55.252741355 +0000 UTC m=+693.161481287" Jan 21 14:40:57 crc kubenswrapper[4720]: I0121 14:40:57.208516 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" event={"ID":"a26c9332-5a74-49a3-8347-45ae67cb1c90","Type":"ContainerStarted","Data":"7aacb0a71bf3d361b9ec556c4329119139f871922ad3541968fe7c9537a421d6"} Jan 21 14:40:57 crc kubenswrapper[4720]: I0121 14:40:57.229065 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-j9dxt" podStartSLOduration=1.6149096680000001 podStartE2EDuration="7.229043317s" podCreationTimestamp="2026-01-21 14:40:50 +0000 UTC" firstStartedPulling="2026-01-21 14:40:50.858639475 +0000 UTC m=+688.767379407" lastFinishedPulling="2026-01-21 14:40:56.472773124 +0000 UTC m=+694.381513056" observedRunningTime="2026-01-21 14:40:57.221414498 +0000 UTC m=+695.130154440" watchObservedRunningTime="2026-01-21 14:40:57.229043317 +0000 UTC m=+695.137783249" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.432554 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-l74mh" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.768146 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.768501 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85d688dff7-p76qd" Jan 21 14:41:00 crc kubenswrapper[4720]: I0121 14:41:00.772946 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85d688dff7-p76qd" 
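
The "Observed pod startup duration" records above expose the tracker's arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The console pod's zero-value pull timestamps ("0001-01-01 ...") leave the two equal at 2.178s, while nmstate-console-plugin's 2.738s pull gap shrinks 5.193s to 2.455s. A minimal Go sketch of that subtraction, using the logged values (this reproduces the numbers, not kubelet's internal code):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps as printed in the log, without the monotonic m=+ suffix.
	parse := func(s string) time.Time {
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2026-01-21 14:40:50 +0000 UTC")
	running := parse("2026-01-21 14:40:55.192917183 +0000 UTC")
	pullStart := parse("2026-01-21 14:40:51.35405548 +0000 UTC")
	pullEnd := parse("2026-01-21 14:40:54.092398274 +0000 UTC")

	e2e := running.Sub(created)         // podStartE2EDuration: 5.192917183s
	slo := e2e - pullEnd.Sub(pullStart) // podStartSLOduration: ~2.454574s
	fmt.Println(e2e, slo)
}
```

Running it prints 5.192917183s and 2.454574389s; the logged 2.454574379 differs only in the last digits of float formatting.
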
Jan 21 14:41:01 crc kubenswrapper[4720]: I0121 14:41:01.237417 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85d688dff7-p76qd"
Jan 21 14:41:01 crc kubenswrapper[4720]: I0121 14:41:01.284439 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-42g76"]
Jan 21 14:41:11 crc kubenswrapper[4720]: I0121 14:41:11.005690 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-xcckr"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.650310 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.652079 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.654496 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.663610 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743430 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743521 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.743631 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846008 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846407 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846893 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.846441 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.847026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.882690 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:23 crc kubenswrapper[4720]: I0121 14:41:23.968195 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:24 crc kubenswrapper[4720]: I0121 14:41:24.373951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"]
Jan 21 14:41:24 crc kubenswrapper[4720]: W0121 14:41:24.385847 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93611686_cfcc_4f9b_985d_a8e0d9cb7219.slice/crio-6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488 WatchSource:0}: Error finding container 6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488: Status 404 returned error can't find the container with id 6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488
Jan 21 14:41:25 crc kubenswrapper[4720]: I0121 14:41:25.373872 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerStarted","Data":"6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488"}
Jan 21 14:41:26 crc kubenswrapper[4720]: I0121 14:41:26.345229 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" containerID="cri-o://d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" gracePeriod=15
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.292883 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-42g76_ac15d591-5558-4df9-b596-a1e27325bd6c/console/0.log"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.293313 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388794 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-42g76_ac15d591-5558-4df9-b596-a1e27325bd6c/console/0.log"
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388848 4720 generic.go:334] "Generic (PLEG): container finished" podID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" exitCode=2
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerDied","Data":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"}
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388943 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-42g76" event={"ID":"ac15d591-5558-4df9-b596-a1e27325bd6c","Type":"ContainerDied","Data":"28165debc992515a62bbac33db73e05a5347bebc002b160765e6c1b991bcf92e"}
Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.388941 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-42g76"
Need to start a new one" pod="openshift-console/console-f9d7485db-42g76" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.389009 4720 scope.go:117] "RemoveContainer" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.390685 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.390921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392867 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392926 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.392972 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393003 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393051 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") pod \"ac15d591-5558-4df9-b596-a1e27325bd6c\" (UID: \"ac15d591-5558-4df9-b596-a1e27325bd6c\") " Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391542 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="d317812bc13480a9a7c6599235163fbdc34765b9104211917cffb37875fd7f5c" exitCode=0 Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391576 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"d317812bc13480a9a7c6599235163fbdc34765b9104211917cffb37875fd7f5c"} Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.391876 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.393948 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config" (OuterVolumeSpecName: "console-config") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.394522 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca" (OuterVolumeSpecName: "service-ca") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.396192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.401148 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.407103 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f" (OuterVolumeSpecName: "kube-api-access-nzt2f") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "kube-api-access-nzt2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.408192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ac15d591-5558-4df9-b596-a1e27325bd6c" (UID: "ac15d591-5558-4df9-b596-a1e27325bd6c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.460807 4720 scope.go:117] "RemoveContainer" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: E0121 14:41:27.461247 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": container with ID starting with d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1 not found: ID does not exist" containerID="d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.461342 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1"} err="failed to get container status \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": rpc error: code = NotFound desc = could not find container \"d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1\": container with ID starting with d017a8fc53b7b34bb07b21c6bfe71cb99f83c209c27a0147afc5ce68886d64e1 not found: ID does not exist" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494077 4720 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494100 4720 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494110 4720 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-service-ca\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494118 4720 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-console-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494126 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzt2f\" (UniqueName: \"kubernetes.io/projected/ac15d591-5558-4df9-b596-a1e27325bd6c-kube-api-access-nzt2f\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494134 4720 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac15d591-5558-4df9-b596-a1e27325bd6c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.494141 4720 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac15d591-5558-4df9-b596-a1e27325bd6c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.715326 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:41:27 crc kubenswrapper[4720]: I0121 14:41:27.719785 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-42g76"] Jan 21 14:41:28 
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.169002 4720 patch_prober.go:28] interesting pod/console-f9d7485db-42g76 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.169066 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-f9d7485db-42g76" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 21 14:41:28 crc kubenswrapper[4720]: I0121 14:41:28.685963 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" path="/var/lib/kubelet/pods/ac15d591-5558-4df9-b596-a1e27325bd6c/volumes"
Jan 21 14:41:29 crc kubenswrapper[4720]: I0121 14:41:29.416236 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="7c6456fe74570fb8e91805cb05b2f00cdcee03d21bd6b62636db15e2ddc2cacc" exitCode=0
Jan 21 14:41:29 crc kubenswrapper[4720]: I0121 14:41:29.416298 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"7c6456fe74570fb8e91805cb05b2f00cdcee03d21bd6b62636db15e2ddc2cacc"}
Jan 21 14:41:30 crc kubenswrapper[4720]: I0121 14:41:30.423355 4720 generic.go:334] "Generic (PLEG): container finished" podID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerID="b1513ae31d6d95b27359f88b9e6cce6e2d01bfa32ab6bb5a175814a2e3252d12" exitCode=0
Jan 21 14:41:30 crc kubenswrapper[4720]: I0121 14:41:30.423475 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"b1513ae31d6d95b27359f88b9e6cce6e2d01bfa32ab6bb5a175814a2e3252d12"}
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.720865 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750291 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") "
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750408 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") "
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.750450 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") pod \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\" (UID: \"93611686-cfcc-4f9b-985d-a8e0d9cb7219\") "
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.753343 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle" (OuterVolumeSpecName: "bundle") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.757389 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w" (OuterVolumeSpecName: "kube-api-access-4l99w") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "kube-api-access-4l99w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.770851 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util" (OuterVolumeSpecName: "util") pod "93611686-cfcc-4f9b-985d-a8e0d9cb7219" (UID: "93611686-cfcc-4f9b-985d-a8e0d9cb7219"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852087 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-util\") on node \"crc\" DevicePath \"\""
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852129 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l99w\" (UniqueName: \"kubernetes.io/projected/93611686-cfcc-4f9b-985d-a8e0d9cb7219-kube-api-access-4l99w\") on node \"crc\" DevicePath \"\""
Jan 21 14:41:31 crc kubenswrapper[4720]: I0121 14:41:31.852140 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/93611686-cfcc-4f9b-985d-a8e0d9cb7219-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb" event={"ID":"93611686-cfcc-4f9b-985d-a8e0d9cb7219","Type":"ContainerDied","Data":"6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488"}
Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435539 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c06759cbf3c34255e046bc9f5a37ffe9f7084c2fa80c769c0aa258efba94488"
Jan 21 14:41:32 crc kubenswrapper[4720]: I0121 14:41:32.435542 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.402893 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s476k"]
Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403680 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="pull"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403695 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="pull"
Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403709 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="util"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403717 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="util"
Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403735 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403743 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console"
Jan 21 14:41:39 crc kubenswrapper[4720]: E0121 14:41:39.403751 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403759 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403879 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="93611686-cfcc-4f9b-985d-a8e0d9cb7219" containerName="extract"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.403895 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac15d591-5558-4df9-b596-a1e27325bd6c" containerName="console"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.404820 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.426822 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s476k"]
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455207 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455260 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.455293 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.556970 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557054 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557572 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.557615 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.584585 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"community-operators-s476k\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:39 crc kubenswrapper[4720]: I0121 14:41:39.721082 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.138294 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s476k"]
Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482036 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1" exitCode=0
Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482127 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1"}
Jan 21 14:41:40 crc kubenswrapper[4720]: I0121 14:41:40.482271 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"501dc062329c9af7f8d1683a77c7040d8ab41ee73e94a74219700bf01c887a58"}
Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.489474 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae"}
Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.752396 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"]
Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.753485 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.757688 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.757929 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761078 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dhnxl" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761444 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.761960 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.778915 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"] Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.785917 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.785997 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.786019 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887044 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887084 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.887131 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.893231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-webhook-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.894157 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6fdd799-fe82-4cd7-b825-c755b6189180-apiservice-cert\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:41 crc kubenswrapper[4720]: I0121 14:41:41.918437 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mdt2\" (UniqueName: \"kubernetes.io/projected/b6fdd799-fe82-4cd7-b825-c755b6189180-kube-api-access-7mdt2\") pod \"metallb-operator-controller-manager-7b8c8cff46-cbv67\" (UID: \"b6fdd799-fe82-4cd7-b825-c755b6189180\") " pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.067111 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.108409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"] Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.109516 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.120314 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.120839 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.121145 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4bkg6" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.133318 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"] Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190410 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.190452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.291295 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.291641 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.292350 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.308481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-apiservice-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.312435 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-webhook-cert\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.344556 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhw7n\" (UniqueName: \"kubernetes.io/projected/6c334ce5-b6c7-40c8-a261-5a5084ae3db8-kube-api-access-vhw7n\") pod \"metallb-operator-webhook-server-75df998c5f-tnbdz\" (UID: \"6c334ce5-b6c7-40c8-a261-5a5084ae3db8\") " pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.431983 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.497876 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae" exitCode=0
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.497925 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae"}
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.760987 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"]
Jan 21 14:41:42 crc kubenswrapper[4720]: W0121 14:41:42.769039 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6fdd799_fe82_4cd7_b825_c755b6189180.slice/crio-8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158 WatchSource:0}: Error finding container 8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158: Status 404 returned error can't find the container with id 8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158
Jan 21 14:41:42 crc kubenswrapper[4720]: I0121 14:41:42.788096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"]
Jan 21 14:41:42 crc kubenswrapper[4720]: W0121 14:41:42.795499 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c334ce5_b6c7_40c8_a261_5a5084ae3db8.slice/crio-ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1 WatchSource:0}: Error finding container ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1: Status 404 returned error can't find the container with id ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1
Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.505363 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" event={"ID":"b6fdd799-fe82-4cd7-b825-c755b6189180","Type":"ContainerStarted","Data":"8180a71a8cb386db3be4db1912caf2db0f8136bfc9916b47624320e599eb4158"}
Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.509160 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerStarted","Data":"68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9"}
Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.510430 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" event={"ID":"6c334ce5-b6c7-40c8-a261-5a5084ae3db8","Type":"ContainerStarted","Data":"ddafffc1fc78333e9df9e6ed6a51bed5e45a5f0e0d71f5ae024b45bd1b87a6d1"}
Jan 21 14:41:43 crc kubenswrapper[4720]: I0121 14:41:43.530539 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s476k" podStartSLOduration=2.070362448 podStartE2EDuration="4.530519021s" podCreationTimestamp="2026-01-21 14:41:39 +0000 UTC" firstStartedPulling="2026-01-21 14:41:40.483311944 +0000 UTC m=+738.392051876" lastFinishedPulling="2026-01-21 14:41:42.943468517 +0000 UTC m=+740.852208449" observedRunningTime="2026-01-21 14:41:43.525643159 +0000 UTC m=+741.434383101" watchObservedRunningTime="2026-01-21 14:41:43.530519021 +0000 UTC m=+741.439258953"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.553309 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" event={"ID":"6c334ce5-b6c7-40c8-a261-5a5084ae3db8","Type":"ContainerStarted","Data":"60e8f62f1c6586e4983f90dd3e72d9a9553f94285ab954fb178b144a36b88655"}
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.553842 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.554573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" event={"ID":"b6fdd799-fe82-4cd7-b825-c755b6189180","Type":"ContainerStarted","Data":"3bc2cb9b972e30d0f4c9ba67b9f3df87a323c8fc889fe67ba255fdf3a5d02197"}
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.554799 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.578209 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" podStartSLOduration=1.304114934 podStartE2EDuration="7.578193621s" podCreationTimestamp="2026-01-21 14:41:42 +0000 UTC" firstStartedPulling="2026-01-21 14:41:42.800080454 +0000 UTC m=+740.708820386" lastFinishedPulling="2026-01-21 14:41:49.074159141 +0000 UTC m=+746.982899073" observedRunningTime="2026-01-21 14:41:49.576960529 +0000 UTC m=+747.485700471" watchObservedRunningTime="2026-01-21 14:41:49.578193621 +0000 UTC m=+747.486933553"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.603259 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" podStartSLOduration=2.382912654 podStartE2EDuration="8.603242195s" podCreationTimestamp="2026-01-21 14:41:41 +0000 UTC" firstStartedPulling="2026-01-21 14:41:42.774061605 +0000 UTC m=+740.682801537" lastFinishedPulling="2026-01-21 14:41:48.994391156 +0000 UTC m=+746.903131078" observedRunningTime="2026-01-21 14:41:49.597306293 +0000 UTC m=+747.506046235" watchObservedRunningTime="2026-01-21 14:41:49.603242195 +0000 UTC m=+747.511982127"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.722054 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.722106 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:49 crc kubenswrapper[4720]: I0121 14:41:49.765691 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:50 crc kubenswrapper[4720]: I0121 14:41:50.605138 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s476k"
Jan 21 14:41:50 crc kubenswrapper[4720]: I0121 14:41:50.648785 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s476k"]
Jan 21 14:41:52 crc kubenswrapper[4720]: I0121 14:41:52.570314 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s476k" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" containerID="cri-o://68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" gracePeriod=2
Jan 21 14:41:53 crc kubenswrapper[4720]: I0121 14:41:53.576072 4720 generic.go:334] "Generic (PLEG): container finished" podID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerID="68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" exitCode=0
Jan 21 14:41:53 crc kubenswrapper[4720]: I0121 14:41:53.576218 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9"}
Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.136715 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k"
Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285827 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285877 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.285908 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") pod \"488fea59-5b8b-41f0-82c4-e148ffe21d66\" (UID: \"488fea59-5b8b-41f0-82c4-e148ffe21d66\") " Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.286859 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities" (OuterVolumeSpecName: "utilities") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.292864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9" (OuterVolumeSpecName: "kube-api-access-gmhc9") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "kube-api-access-gmhc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.342008 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488fea59-5b8b-41f0-82c4-e148ffe21d66" (UID: "488fea59-5b8b-41f0-82c4-e148ffe21d66"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387237 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmhc9\" (UniqueName: \"kubernetes.io/projected/488fea59-5b8b-41f0-82c4-e148ffe21d66-kube-api-access-gmhc9\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387275 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.387286 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488fea59-5b8b-41f0-82c4-e148ffe21d66-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582561 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s476k" event={"ID":"488fea59-5b8b-41f0-82c4-e148ffe21d66","Type":"ContainerDied","Data":"501dc062329c9af7f8d1683a77c7040d8ab41ee73e94a74219700bf01c887a58"} Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582615 4720 scope.go:117] "RemoveContainer" containerID="68a5b5e50b102ff4372ff39c20b1802b9ea9e1447a4d35e5ea702d07eb3dd7c9" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.582622 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s476k" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.611629 4720 scope.go:117] "RemoveContainer" containerID="35ae302a057d3c43620b7d12dcea42a48331849da309b9a105b1e699591e8bae" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.630994 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.655300 4720 scope.go:117] "RemoveContainer" containerID="c75f1a265d59a3285caa8248a4c804dda31a451dcadbae4bb1862a14414777d1" Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.657107 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s476k"] Jan 21 14:41:54 crc kubenswrapper[4720]: I0121 14:41:54.683793 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" path="/var/lib/kubelet/pods/488fea59-5b8b-41f0-82c4-e148ffe21d66/volumes" Jan 21 14:42:02 crc kubenswrapper[4720]: I0121 14:42:02.436786 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-75df998c5f-tnbdz" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.071085 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b8c8cff46-cbv67" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.879972 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.880031 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905076 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-ldp4q"] Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905353 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905376 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905392 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-content" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905401 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-content" Jan 21 14:42:22 crc kubenswrapper[4720]: E0121 14:42:22.905422 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-utilities" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905431 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="extract-utilities" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.905551 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="488fea59-5b8b-41f0-82c4-e148ffe21d66" containerName="registry-server" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.907951 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.911158 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.911492 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-kn5dr" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.959200 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.969467 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"] Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.970070 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.971556 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979025 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979084 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979111 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979137 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.979153 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.980619 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:22 crc kubenswrapper[4720]: I0121 14:42:22.996671 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081431 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081479 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081507 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081547 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081600 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081668 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.081697 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.082327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-conf\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083057 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/bc431866-4baf-47fc-8767-705a11b9bea0-frr-startup\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " 
pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083242 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-metrics\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-reloader\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.083563 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/bc431866-4baf-47fc-8767-705a11b9bea0-frr-sockets\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.083699 4720 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.083743 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs podName:bc431866-4baf-47fc-8767-705a11b9bea0 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.583727044 +0000 UTC m=+781.492466966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs") pod "frr-k8s-ldp4q" (UID: "bc431866-4baf-47fc-8767-705a11b9bea0") : secret "frr-k8s-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.090222 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-m7fv6"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.091061 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.095500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.095681 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.096977 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.105815 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ghmgt" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.116609 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.129292 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.147228 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.191568 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h66h6\" (UniqueName: \"kubernetes.io/projected/bc431866-4baf-47fc-8767-705a11b9bea0-kube-api-access-h66h6\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.209793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218020 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218162 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218317 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.218642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.219347 4720 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.220272 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert podName:8ba45f1e-4559-4408-b129-b061d406fce6 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.720252018 +0000 UTC m=+781.628991950 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert") pod "frr-k8s-webhook-server-7df86c4f6c-lsrs9" (UID: "8ba45f1e-4559-4408-b129-b061d406fce6") : secret "frr-k8s-webhook-server-cert" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.226366 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"] Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.251491 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7tpg\" (UniqueName: \"kubernetes.io/projected/8ba45f1e-4559-4408-b129-b061d406fce6-kube-api-access-v7tpg\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324745 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324809 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324838 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324858 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324896 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.324919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 
crc kubenswrapper[4720]: I0121 14:42:23.325735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metallb-excludel2\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.325874 4720 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.325977 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.825952734 +0000 UTC m=+781.734692666 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "speaker-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.326138 4720 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.326475 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.826452687 +0000 UTC m=+781.735192619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.376722 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c45bt\" (UniqueName: \"kubernetes.io/projected/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-kube-api-access-c45bt\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.425800 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.426024 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.426089 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.426258 4720 
secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.426327 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs podName:51379103-8c08-45c6-a0f3-86928d43bd50 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:23.926310693 +0000 UTC m=+781.835050635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs") pod "controller-6968d8fdc4-72sfn" (UID: "51379103-8c08-45c6-a0f3-86928d43bd50") : secret "controller-certs-secret" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.428463 4720 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.444129 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-cert\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.447304 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpchb\" (UniqueName: \"kubernetes.io/projected/51379103-8c08-45c6-a0f3-86928d43bd50-kube-api-access-lpchb\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.628538 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.632221 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc431866-4baf-47fc-8767-705a11b9bea0-metrics-certs\") pod \"frr-k8s-ldp4q\" (UID: \"bc431866-4baf-47fc-8767-705a11b9bea0\") " pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.729759 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.733912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ba45f1e-4559-4408-b129-b061d406fce6-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-lsrs9\" (UID: \"8ba45f1e-4559-4408-b129-b061d406fce6\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.831094 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 
14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.831146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.831326 4720 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: E0121 14:42:23.831378 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist podName:49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1 nodeName:}" failed. No retries permitted until 2026-01-21 14:42:24.831365104 +0000 UTC m=+782.740105026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist") pod "speaker-m7fv6" (UID: "49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1") : secret "metallb-memberlist" not found Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.835332 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-metrics-certs\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.869387 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.883952 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.932876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:23 crc kubenswrapper[4720]: I0121 14:42:23.938041 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51379103-8c08-45c6-a0f3-86928d43bd50-metrics-certs\") pod \"controller-6968d8fdc4-72sfn\" (UID: \"51379103-8c08-45c6-a0f3-86928d43bd50\") " pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.132279 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9"] Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.193200 4720 util.go:30] "No sandbox for pod can be found. 
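The memberlist mount failed twice above: first with durationBeforeRetry 500ms, then 1s while the secret was still missing, so the delay doubles per failure. A sketch of that exponential backoff; the 2m2s cap matches kubelet's default for volume operations but is an assumption here, since this log never gets past 1s before the secrets appear.

```python
from datetime import timedelta

# Doubling retry delay, as seen above: 500ms after the first failure,
# 1s after the second. The cap is assumed, not visible in this log.
def backoff_delays(initial=timedelta(milliseconds=500),
                   cap=timedelta(minutes=2, seconds=2)):
    delay = initial
    while True:
        yield min(delay, cap)
        delay = delay * 2

for attempt, delay in enumerate(backoff_delays(), start=1):
    print(f"retry {attempt}: wait {delay.total_seconds():g}s")
    if attempt == 5:
        break
```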
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.414760 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-72sfn"] Jan 21 14:42:24 crc kubenswrapper[4720]: W0121 14:42:24.424844 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51379103_8c08_45c6_a0f3_86928d43bd50.slice/crio-f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38 WatchSource:0}: Error finding container f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38: Status 404 returned error can't find the container with id f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38 Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.778369 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" event={"ID":"8ba45f1e-4559-4408-b129-b061d406fce6","Type":"ContainerStarted","Data":"c5f0dce0886a0f45f4d580530432e224be426dfd5dc9636c8c812544243b96be"} Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780008 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"033635c4cde1f6b14605870b40852ebc1a586337c5021d0360ca4027e4cf5165"} Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780074 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"69893f24dcc0456f382d452250a564ffa504b03705c1e4a3f37dbd144ceeb914"} Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-72sfn" event={"ID":"51379103-8c08-45c6-a0f3-86928d43bd50","Type":"ContainerStarted","Data":"f76b5a3f05a83026e644970664cec8c56d4acec25f0141a6aca4954c12324e38"} Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.780132 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.781059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"c0c41611c7145e5b0e0ea75fd6f73d19e649dbcc38ead343c57f81de823d7f75"} Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.842296 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.852709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1-memberlist\") pod \"speaker-m7fv6\" (UID: \"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1\") " pod="metallb-system/speaker-m7fv6" Jan 21 14:42:24 crc kubenswrapper[4720]: I0121 14:42:24.907939 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.791772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"f0a13214bb75291d97e3af3d605824f46fd8f1c24699d17f7ab7aae38b99e5c7"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792313 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"c1c8e81afb098b499c69cb88296ce85e15eb366a8a357a1195b822ae21081807"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-m7fv6" event={"ID":"49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1","Type":"ContainerStarted","Data":"7a51f48a672401516059e87e6889f5918b5cd9e7bd84ffdb833a30530489e5b0"} Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.792605 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.832220 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-m7fv6" podStartSLOduration=2.832188617 podStartE2EDuration="2.832188617s" podCreationTimestamp="2026-01-21 14:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:42:25.830532191 +0000 UTC m=+783.739272143" watchObservedRunningTime="2026-01-21 14:42:25.832188617 +0000 UTC m=+783.740928549" Jan 21 14:42:25 crc kubenswrapper[4720]: I0121 14:42:25.835883 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-72sfn" podStartSLOduration=2.8358723660000003 podStartE2EDuration="2.835872366s" podCreationTimestamp="2026-01-21 14:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:42:24.80642161 +0000 UTC m=+782.715161542" watchObservedRunningTime="2026-01-21 14:42:25.835872366 +0000 UTC m=+783.744612298" Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.848232 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" event={"ID":"8ba45f1e-4559-4408-b129-b061d406fce6","Type":"ContainerStarted","Data":"45ea7c6bfa6c49e5078825ca8fc3138247ff6c611784682a78cfd6b20c23e07c"} Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.848943 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.850373 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="8cbfd64e143a4d969991c6c3868408621d232046abcb9794c8ba70ac78af5ad7" exitCode=0 Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.850440 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"8cbfd64e143a4d969991c6c3868408621d232046abcb9794c8ba70ac78af5ad7"} Jan 21 14:42:33 crc kubenswrapper[4720]: I0121 14:42:33.883723 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" podStartSLOduration=2.976383444 podStartE2EDuration="11.88369991s" podCreationTimestamp="2026-01-21 14:42:22 +0000 UTC" firstStartedPulling="2026-01-21 14:42:24.140052851 +0000 UTC m=+782.048792783" lastFinishedPulling="2026-01-21 14:42:33.047369317 +0000 UTC m=+790.956109249" observedRunningTime="2026-01-21 14:42:33.876998537 +0000 UTC m=+791.785738479" watchObservedRunningTime="2026-01-21 14:42:33.88369991 +0000 UTC m=+791.792439852" Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.198106 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-72sfn" Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.858070 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="d875c689cd516950e8c727164aa1f83a699878b57d4975b5fcdb4e30cfc35b48" exitCode=0 Jan 21 14:42:34 crc kubenswrapper[4720]: I0121 14:42:34.858422 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"d875c689cd516950e8c727164aa1f83a699878b57d4975b5fcdb4e30cfc35b48"} Jan 21 14:42:35 crc kubenswrapper[4720]: I0121 14:42:35.867787 4720 generic.go:334] "Generic (PLEG): container finished" podID="bc431866-4baf-47fc-8767-705a11b9bea0" containerID="c3f81f517600b572281fe35f7c14ad55fb9dedc54f5d96394ef7c9e491415f51" exitCode=0 Jan 21 14:42:35 crc kubenswrapper[4720]: I0121 14:42:35.867833 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerDied","Data":"c3f81f517600b572281fe35f7c14ad55fb9dedc54f5d96394ef7c9e491415f51"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884381 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"75f094c831763e16abd57d3786b4b507697b41e0827e145b51ae0232971ced2f"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"e4b8cfc74428268a6890248629bde07d7690f7af2a1130a3f58fa6e28ae3c8f8"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884742 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"44b91dda3a8f5c829dc40c325f4a8638311d1721889cd2f36669f3e53dd12984"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884752 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"dd13d9feb684e2f848b5881c989af0162eda63aa8d01ac6762117052094583e8"} Jan 21 14:42:37 crc kubenswrapper[4720]: I0121 14:42:37.884762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"4f5daa2d322c35a72fc0343ceffe578662539dc1c840ade647e38eaef13cb105"} Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.896706 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-ldp4q" 
event={"ID":"bc431866-4baf-47fc-8767-705a11b9bea0","Type":"ContainerStarted","Data":"ebdc542055b717ea624c9c4707f124a6fde6fe9a4cec20285ab032ded8165957"} Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.897056 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:38 crc kubenswrapper[4720]: I0121 14:42:38.924241 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-ldp4q" podStartSLOduration=7.993178997 podStartE2EDuration="16.924222688s" podCreationTimestamp="2026-01-21 14:42:22 +0000 UTC" firstStartedPulling="2026-01-21 14:42:24.155783689 +0000 UTC m=+782.064523621" lastFinishedPulling="2026-01-21 14:42:33.08682738 +0000 UTC m=+790.995567312" observedRunningTime="2026-01-21 14:42:38.92170771 +0000 UTC m=+796.830447672" watchObservedRunningTime="2026-01-21 14:42:38.924222688 +0000 UTC m=+796.832962620" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.869931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.888382 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-lsrs9" Jan 21 14:42:43 crc kubenswrapper[4720]: I0121 14:42:43.918307 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:44 crc kubenswrapper[4720]: I0121 14:42:44.911759 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-m7fv6" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.263803 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.265291 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.272924 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.367960 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.368029 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.368559 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.469961 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470034 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470547 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.470699 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.499884 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"redhat-marketplace-j5gmc\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") " pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.579407 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.830898 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:45 crc kubenswrapper[4720]: I0121 14:42:45.939132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"e6b964d8585fa7b58ed6a33e8427e5c8bd074516b39249e500cc2ba8378e5880"} Jan 21 14:42:46 crc kubenswrapper[4720]: I0121 14:42:46.945673 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411" exitCode=0 Jan 21 14:42:46 crc kubenswrapper[4720]: I0121 14:42:46.945730 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411"} Jan 21 14:42:47 crc kubenswrapper[4720]: I0121 14:42:47.952471 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42"} Jan 21 14:42:48 crc kubenswrapper[4720]: I0121 14:42:48.959475 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42" exitCode=0 Jan 21 14:42:48 crc kubenswrapper[4720]: I0121 14:42:48.959535 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42"} Jan 21 14:42:49 crc kubenswrapper[4720]: I0121 14:42:49.968677 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerStarted","Data":"68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d"} Jan 21 14:42:49 crc kubenswrapper[4720]: I0121 14:42:49.992802 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j5gmc" podStartSLOduration=2.38661926 podStartE2EDuration="4.992787581s" podCreationTimestamp="2026-01-21 14:42:45 +0000 UTC" firstStartedPulling="2026-01-21 14:42:46.947770342 +0000 UTC m=+804.856510274" lastFinishedPulling="2026-01-21 14:42:49.553938673 +0000 UTC m=+807.462678595" observedRunningTime="2026-01-21 14:42:49.987359954 +0000 UTC m=+807.896099956" watchObservedRunningTime="2026-01-21 14:42:49.992787581 +0000 UTC m=+807.901527513" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.834311 4720 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.835213 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.837709 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-sn7bq" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.838118 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.840518 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.846618 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:51 crc kubenswrapper[4720]: I0121 14:42:51.973360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.075372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.094451 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"openstack-operator-index-rgdp7\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") " pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.153410 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.350792 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.880746 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.881076 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:42:52 crc kubenswrapper[4720]: I0121 14:42:52.986455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerStarted","Data":"1d71d45e77d980126f90f6eb7ecc522605b8c38e60ba5198007f7f8090fab602"} Jan 21 14:42:53 crc kubenswrapper[4720]: I0121 14:42:53.872984 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-ldp4q" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.579780 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.580111 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:55 crc kubenswrapper[4720]: I0121 14:42:55.638901 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:56 crc kubenswrapper[4720]: I0121 14:42:56.071643 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:42:56 crc kubenswrapper[4720]: I0121 14:42:56.829982 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.033132 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.633934 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"] Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.634675 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.650070 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"]
Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.752064 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9"
Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.853177 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9"
Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.880147 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdj7\" (UniqueName: \"kubernetes.io/projected/5d59157d-f538-4cb0-959d-11584d7678c5-kube-api-access-vkdj7\") pod \"openstack-operator-index-j4xn9\" (UID: \"5d59157d-f538-4cb0-959d-11584d7678c5\") " pod="openstack-operators/openstack-operator-index-j4xn9"
Jan 21 14:42:57 crc kubenswrapper[4720]: I0121 14:42:57.956812 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-j4xn9"
Jan 21 14:42:58 crc kubenswrapper[4720]: I0121 14:42:58.022919 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j5gmc" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" containerID="cri-o://68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" gracePeriod=2
Jan 21 14:42:58 crc kubenswrapper[4720]: I0121 14:42:58.398473 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-j4xn9"]
Jan 21 14:42:58 crc kubenswrapper[4720]: W0121 14:42:58.406577 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d59157d_f538_4cb0_959d_11584d7678c5.slice/crio-cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a WatchSource:0}: Error finding container cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a: Status 404 returned error can't find the container with id cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a
Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.034717 4720 generic.go:334] "Generic (PLEG): container finished" podID="00a18014-031d-42cb-b5b2-c9114b70f910" containerID="68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" exitCode=0
Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.034726 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d"}
Jan 21 14:42:59 crc kubenswrapper[4720]: I0121 14:42:59.036999 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j4xn9" event={"ID":"5d59157d-f538-4cb0-959d-11584d7678c5","Type":"ContainerStarted","Data":"cf258f1812b1296d33db5bf6f5c41c4b001b5de2d627d9afac40a23184e4922a"}
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.191716 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc"
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.289580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") "
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.290014 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") "
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.290097 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") pod \"00a18014-031d-42cb-b5b2-c9114b70f910\" (UID: \"00a18014-031d-42cb-b5b2-c9114b70f910\") "
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.291021 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities" (OuterVolumeSpecName: "utilities") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.295196 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv" (OuterVolumeSpecName: "kube-api-access-8nqhv") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "kube-api-access-8nqhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.329388 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a18014-031d-42cb-b5b2-c9114b70f910" (UID: "00a18014-031d-42cb-b5b2-c9114b70f910"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391273 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391326 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a18014-031d-42cb-b5b2-c9114b70f910-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:00 crc kubenswrapper[4720]: I0121 14:43:00.391346 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nqhv\" (UniqueName: \"kubernetes.io/projected/00a18014-031d-42cb-b5b2-c9114b70f910-kube-api-access-8nqhv\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051799 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j5gmc" event={"ID":"00a18014-031d-42cb-b5b2-c9114b70f910","Type":"ContainerDied","Data":"e6b964d8585fa7b58ed6a33e8427e5c8bd074516b39249e500cc2ba8378e5880"} Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051845 4720 scope.go:117] "RemoveContainer" containerID="68336fe93cda775095a1fc141e22ec482346446b1afe63b04f8eb00f2e53d32d" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.051950 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j5gmc" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.077894 4720 scope.go:117] "RemoveContainer" containerID="c9e505501da00ddaabf3c07df03f508ccad5e3d4eed18bcb9fba921f9836aa42" Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.087782 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.113287 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j5gmc"] Jan 21 14:43:01 crc kubenswrapper[4720]: I0121 14:43:01.121612 4720 scope.go:117] "RemoveContainer" containerID="b975f06fb33742d6dda57bac42eeac59eb01583c78d1824aeaf4afd4891cc411" Jan 21 14:43:02 crc kubenswrapper[4720]: I0121 14:43:02.690018 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" path="/var/lib/kubelet/pods/00a18014-031d-42cb-b5b2-c9114b70f910/volumes" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436038 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436867 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-utilities" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436897 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-utilities" Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436924 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-content" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436939 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="extract-content" Jan 21 14:43:06 crc kubenswrapper[4720]: E0121 14:43:06.436974 4720 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.436990 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.437253 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a18014-031d-42cb-b5b2-c9114b70f910" containerName="registry-server" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.439038 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.495470 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505516 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505633 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.505870 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607188 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607228 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.607914 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.624599 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"certified-operators-dqgp6\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:06 crc kubenswrapper[4720]: I0121 14:43:06.771905 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:14 crc kubenswrapper[4720]: I0121 14:43:14.418412 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.143066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-j4xn9" event={"ID":"5d59157d-f538-4cb0-959d-11584d7678c5","Type":"ContainerStarted","Data":"6b19eeaefd46f15129fe9c2ac59824dc6c5406747555ecf52061bf222fc35d2a"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145141 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a" exitCode=0 Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145201 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.145226 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"cf6301ab54f5e2a0499597c73edfe33ce7ec8e78073269084732953f390b0d82"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.147942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerStarted","Data":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"} Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.148456 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rgdp7" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" containerID="cri-o://0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" gracePeriod=2 Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.165011 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-j4xn9" podStartSLOduration=2.361470482 podStartE2EDuration="18.164988981s" 
Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.201801 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rgdp7" podStartSLOduration=2.353681916 podStartE2EDuration="24.201784282s" podCreationTimestamp="2026-01-21 14:42:51 +0000 UTC" firstStartedPulling="2026-01-21 14:42:52.357448293 +0000 UTC m=+810.266188215" lastFinishedPulling="2026-01-21 14:43:14.205550649 +0000 UTC m=+832.114290581" observedRunningTime="2026-01-21 14:43:15.199058488 +0000 UTC m=+833.107798410" watchObservedRunningTime="2026-01-21 14:43:15.201784282 +0000 UTC m=+833.110524214"
Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.496340 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7"
Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.518164 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") pod \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\" (UID: \"f29e3816-0dc6-4e24-80ce-3f0669a92a8a\") "
Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.523604 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7" (OuterVolumeSpecName: "kube-api-access-m4kz7") pod "f29e3816-0dc6-4e24-80ce-3f0669a92a8a" (UID: "f29e3816-0dc6-4e24-80ce-3f0669a92a8a"). InnerVolumeSpecName "kube-api-access-m4kz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:43:15 crc kubenswrapper[4720]: I0121 14:43:15.619757 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4kz7\" (UniqueName: \"kubernetes.io/projected/f29e3816-0dc6-4e24-80ce-3f0669a92a8a-kube-api-access-m4kz7\") on node \"crc\" DevicePath \"\""
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154690 4720 generic.go:334] "Generic (PLEG): container finished" podID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab" exitCode=0
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154772 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rgdp7"
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154785 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerDied","Data":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"}
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154818 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rgdp7" event={"ID":"f29e3816-0dc6-4e24-80ce-3f0669a92a8a","Type":"ContainerDied","Data":"1d71d45e77d980126f90f6eb7ecc522605b8c38e60ba5198007f7f8090fab602"}
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.154838 4720 scope.go:117] "RemoveContainer" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.157816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"}
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.171471 4720 scope.go:117] "RemoveContainer" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"
Jan 21 14:43:16 crc kubenswrapper[4720]: E0121 14:43:16.173085 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": container with ID starting with 0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab not found: ID does not exist" containerID="0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.173228 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab"} err="failed to get container status \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": rpc error: code = NotFound desc = could not find container \"0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab\": container with ID starting with 0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab not found: ID does not exist"
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.204571 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"]
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.212519 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rgdp7"]
Jan 21 14:43:16 crc kubenswrapper[4720]: I0121 14:43:16.686241 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" path="/var/lib/kubelet/pods/f29e3816-0dc6-4e24-80ce-3f0669a92a8a/volumes"
Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.167501 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9" exitCode=0
Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.167611 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"}
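The E-level "ContainerStatus from runtime service failed ... NotFound" lines immediately after a RemoveContainer are a benign race: the container was already gone from CRI-O when kubelet re-queried its status, so the delete has effectively already succeeded. The idiomatic pattern is to treat gRPC NotFound as success for deletes, sketched below; deleteContainer is a hypothetical stand-in for a CRI client call, not kubelet's API.

```go
// notfound_sketch.go -- treating "already gone" as success when deleting,
// mirroring the NotFound race visible in the log above.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// deleteContainer is a placeholder that pretends the runtime already
// removed the container, as CRI-O reports in the log.
func deleteContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

func main() {
	err := deleteContainer("0a8b40a86261027ecb7d0f1eece309e11545a8aaf707a9bcbbc1d9470ab3e4ab")
	if status.Code(err) == codes.NotFound {
		fmt.Println("container already removed; nothing to do") // benign outcome
	} else if err != nil {
		fmt.Println("delete failed:", err)
	}
}
```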
event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"} Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.957180 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.957852 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:17 crc kubenswrapper[4720]: I0121 14:43:17.987086 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:18 crc kubenswrapper[4720]: I0121 14:43:18.175233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerStarted","Data":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} Jan 21 14:43:18 crc kubenswrapper[4720]: I0121 14:43:18.190481 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqgp6" podStartSLOduration=9.664984352 podStartE2EDuration="12.190463039s" podCreationTimestamp="2026-01-21 14:43:06 +0000 UTC" firstStartedPulling="2026-01-21 14:43:15.147805673 +0000 UTC m=+833.056545615" lastFinishedPulling="2026-01-21 14:43:17.67328435 +0000 UTC m=+835.582024302" observedRunningTime="2026-01-21 14:43:18.189931875 +0000 UTC m=+836.098671817" watchObservedRunningTime="2026-01-21 14:43:18.190463039 +0000 UTC m=+836.099202981" Jan 21 14:43:19 crc kubenswrapper[4720]: I0121 14:43:19.204255 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-j4xn9" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.873501 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:21 crc kubenswrapper[4720]: E0121 14:43:21.873886 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.873904 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.874058 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29e3816-0dc6-4e24-80ce-3f0669a92a8a" containerName="registry-server" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.875053 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.877078 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zr58t" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.891026 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901346 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:21 crc kubenswrapper[4720]: I0121 14:43:21.901446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002413 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002456 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002494 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002935 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.002964 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.027035 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.192675 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.654506 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g"] Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880464 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880528 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.880588 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.881229 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:43:22 crc kubenswrapper[4720]: I0121 14:43:22.881282 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47" gracePeriod=600 Jan 21 14:43:23 crc kubenswrapper[4720]: I0121 14:43:23.218550 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerStarted","Data":"e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.226385 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="7156339bbe2ef44793d9c82fb8ad6b72ea97b6fcf7156ca0cf6c708c18ca8d2e" exitCode=0 Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.226473 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"7156339bbe2ef44793d9c82fb8ad6b72ea97b6fcf7156ca0cf6c708c18ca8d2e"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230340 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47" exitCode=0 Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230368 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230388 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} Jan 21 14:43:24 crc kubenswrapper[4720]: I0121 14:43:24.230403 4720 scope.go:117] "RemoveContainer" containerID="a61755d2d50927cd3c032bcad351e940f76beb15aa30f49c45cc8f2e261c405c" Jan 21 14:43:25 crc kubenswrapper[4720]: I0121 14:43:25.243834 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="b4ef023ecea2b77b19d79870d43876fa179747ce0c6ae26e6cdf987696ba54d6" exitCode=0 Jan 21 14:43:25 crc kubenswrapper[4720]: I0121 14:43:25.243921 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"b4ef023ecea2b77b19d79870d43876fa179747ce0c6ae26e6cdf987696ba54d6"} Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.253022 4720 generic.go:334] "Generic (PLEG): container finished" podID="533f904c-bfa5-42e7-a907-5fe372443d20" containerID="c7076facda3e1c647765188f6a34fd22ef222fa1b3f88427381505c6288daf09" exitCode=0 Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.253238 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"c7076facda3e1c647765188f6a34fd22ef222fa1b3f88427381505c6288daf09"} Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.772432 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.772480 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:26 crc kubenswrapper[4720]: I0121 14:43:26.813019 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.303285 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.498370 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690216 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690297 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.690397 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") pod \"533f904c-bfa5-42e7-a907-5fe372443d20\" (UID: \"533f904c-bfa5-42e7-a907-5fe372443d20\") " Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.691313 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle" (OuterVolumeSpecName: "bundle") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.695189 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq" (OuterVolumeSpecName: "kube-api-access-bd6xq") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "kube-api-access-bd6xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.715795 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util" (OuterVolumeSpecName: "util") pod "533f904c-bfa5-42e7-a907-5fe372443d20" (UID: "533f904c-bfa5-42e7-a907-5fe372443d20"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791911 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd6xq\" (UniqueName: \"kubernetes.io/projected/533f904c-bfa5-42e7-a907-5fe372443d20-kube-api-access-bd6xq\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791957 4720 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:27 crc kubenswrapper[4720]: I0121 14:43:27.791976 4720 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/533f904c-bfa5-42e7-a907-5fe372443d20-util\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.268182 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.268351 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g" event={"ID":"533f904c-bfa5-42e7-a907-5fe372443d20","Type":"ContainerDied","Data":"e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a"} Jan 21 14:43:28 crc kubenswrapper[4720]: I0121 14:43:28.269122 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e514908471d3b967ee61abab72dafcdb91214af2ee11cfe412af578161525e5a" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.227092 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"] Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.272440 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqgp6" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server" containerID="cri-o://b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" gracePeriod=2 Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.789330 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dqgp6" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830085 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830195 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.830286 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") pod \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\" (UID: \"81ab0c5a-1ce9-47b5-aa19-ff309d2da011\") " Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.839619 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities" (OuterVolumeSpecName: "utilities") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.844361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs" (OuterVolumeSpecName: "kube-api-access-5fgxs") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "kube-api-access-5fgxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.891949 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81ab0c5a-1ce9-47b5-aa19-ff309d2da011" (UID: "81ab0c5a-1ce9-47b5-aa19-ff309d2da011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932108 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932143 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fgxs\" (UniqueName: \"kubernetes.io/projected/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-kube-api-access-5fgxs\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:29 crc kubenswrapper[4720]: I0121 14:43:29.932154 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81ab0c5a-1ce9-47b5-aa19-ff309d2da011-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284595 4720 generic.go:334] "Generic (PLEG): container finished" podID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" exitCode=0 Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284645 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqgp6" event={"ID":"81ab0c5a-1ce9-47b5-aa19-ff309d2da011","Type":"ContainerDied","Data":"cf6301ab54f5e2a0499597c73edfe33ce7ec8e78073269084732953f390b0d82"} Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284750 4720 scope.go:117] "RemoveContainer" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e" Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.284774 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.313864 4720 scope.go:117] "RemoveContainer" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.334830 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"]
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.338582 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqgp6"]
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.347853 4720 scope.go:117] "RemoveContainer" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367192 4720 scope.go:117] "RemoveContainer" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"
Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.367644 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": container with ID starting with b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e not found: ID does not exist" containerID="b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367770 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e"} err="failed to get container status \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": rpc error: code = NotFound desc = could not find container \"b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e\": container with ID starting with b3f99d4f2ce2bbfe7e8fc446a2db45eb7f41e089f226ab025d38620841376b4e not found: ID does not exist"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.367812 4720 scope.go:117] "RemoveContainer" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"
Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.368224 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": container with ID starting with 4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9 not found: ID does not exist" containerID="4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368259 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9"} err="failed to get container status \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": rpc error: code = NotFound desc = could not find container \"4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9\": container with ID starting with 4ab987573bcd60c27d8b0864e21351f96decf9ea681813ab599d3092013563e9 not found: ID does not exist"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368292 4720 scope.go:117] "RemoveContainer" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"
Jan 21 14:43:30 crc kubenswrapper[4720]: E0121 14:43:30.368879 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": container with ID starting with f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a not found: ID does not exist" containerID="f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.368911 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a"} err="failed to get container status \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": rpc error: code = NotFound desc = could not find container \"f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a\": container with ID starting with f3185b287551199007386d54d15929dc7f16a36bafd3fbd778ccdd60bd84304a not found: ID does not exist"
Jan 21 14:43:30 crc kubenswrapper[4720]: I0121 14:43:30.687003 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" path="/var/lib/kubelet/pods/81ab0c5a-1ce9-47b5-aa19-ff309d2da011/volumes"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368610 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"]
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368861 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368874 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract"
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368893 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="util"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368901 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="util"
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368913 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368921 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server"
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368935 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-utilities"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368943 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-utilities"
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368958 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="pull"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368966 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="pull"
Jan 21 14:43:32 crc kubenswrapper[4720]: E0121 14:43:32.368980 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-content"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.368987 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="extract-content"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369139 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="533f904c-bfa5-42e7-a907-5fe372443d20" containerName="extract"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="81ab0c5a-1ce9-47b5-aa19-ff309d2da011" containerName="registry-server"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.369581 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.371432 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-4ngth"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.452181 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"]
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.463992 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.565984 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.585404 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mth5q\" (UniqueName: \"kubernetes.io/projected/d3800217-b53a-4788-a9d4-8861cfdb68a1-kube-api-access-mth5q\") pod \"openstack-operator-controller-init-68fc899677-pbmmn\" (UID: \"d3800217-b53a-4788-a9d4-8861cfdb68a1\") " pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.687462 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:32 crc kubenswrapper[4720]: I0121 14:43:32.991379 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"]
Jan 21 14:43:33 crc kubenswrapper[4720]: I0121 14:43:33.310383 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" event={"ID":"d3800217-b53a-4788-a9d4-8861cfdb68a1","Type":"ContainerStarted","Data":"ddd4e1059f51fd84885f344a93627688387ac371ba2ce7278799e9189a4b6973"}
Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.349390 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" event={"ID":"d3800217-b53a-4788-a9d4-8861cfdb68a1","Type":"ContainerStarted","Data":"11a85db7a64fafef1e314d06f183216c8b0d9c71c648d6691047e0a3ac0f7043"}
Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.349931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:43:38 crc kubenswrapper[4720]: I0121 14:43:38.384076 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn" podStartSLOduration=1.4094432000000001 podStartE2EDuration="6.384051129s" podCreationTimestamp="2026-01-21 14:43:32 +0000 UTC" firstStartedPulling="2026-01-21 14:43:33.007100476 +0000 UTC m=+850.915840408" lastFinishedPulling="2026-01-21 14:43:37.981708405 +0000 UTC m=+855.890448337" observedRunningTime="2026-01-21 14:43:38.37929387 +0000 UTC m=+856.288033802" watchObservedRunningTime="2026-01-21 14:43:38.384051129 +0000 UTC m=+856.292791081"
Jan 21 14:43:52 crc kubenswrapper[4720]: I0121 14:43:52.691197 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-68fc899677-pbmmn"
Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.545038 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"]
Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.546432 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.547688 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.549614 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-48m8m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.561688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.573364 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.574337 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.579917 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zn7jv" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.581295 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.582211 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.584046 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-cgt55" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.593766 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.603896 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.637556 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.638393 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649437 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649526 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649586 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.649626 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.685391 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-88wsq" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.685533 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.696788 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7q2\" (UniqueName: \"kubernetes.io/projected/b7ea6739-9c38-44a0-a382-8b26e37138fa-kube-api-access-fl7q2\") pod \"cinder-operator-controller-manager-9b68f5989-wnzfm\" (UID: \"b7ea6739-9c38-44a0-a382-8b26e37138fa\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.745871 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.746641 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750739 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.750828 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.751486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-clrhz" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.754533 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.755414 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.757503 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-75pgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.767745 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.774968 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.797616 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2j4j\" (UniqueName: \"kubernetes.io/projected/96218341-1cf7-4aa1-bb9a-7a7abba7a93e-kube-api-access-j2j4j\") pod \"designate-operator-controller-manager-9f958b845-bjn2r\" (UID: \"96218341-1cf7-4aa1-bb9a-7a7abba7a93e\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.800350 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twq9l\" (UniqueName: \"kubernetes.io/projected/655f8c6a-4936-45d3-9538-66ee77a050d3-kube-api-access-twq9l\") pod \"barbican-operator-controller-manager-7ddb5c749-q2t2m\" (UID: \"655f8c6a-4936-45d3-9538-66ee77a050d3\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.831580 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ww4p\" (UniqueName: \"kubernetes.io/projected/6c93648a-7076-4d91-ac7a-f389ab1159cc-kube-api-access-7ww4p\") pod \"glance-operator-controller-manager-c6994669c-gwlgm\" (UID: \"6c93648a-7076-4d91-ac7a-f389ab1159cc\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.852402 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.863993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.866121 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.866834 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.869813 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zcxwg" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.869984 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.894037 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.897121 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.897950 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.900099 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.906012 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.906585 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.911977 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-p4v8k" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.912229 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hjtz5" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.937032 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.948363 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.957995 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.959096 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.959136 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:13 crc kubenswrapper[4720]: I0121 14:44:13.994137 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.028099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqbcn\" (UniqueName: \"kubernetes.io/projected/071d4469-5b09-49a3-97f4-239d811825a2-kube-api-access-jqbcn\") pod \"horizon-operator-controller-manager-77d5c5b54f-vfxfh\" (UID: \"071d4469-5b09-49a3-97f4-239d811825a2\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.043888 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.044592 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.061560 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-qfvft" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.062035 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.062976 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063001 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063033 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.063342 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.068048 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.068883 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.093032 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jq5jw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.142488 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nlh\" (UniqueName: \"kubernetes.io/projected/9a5569f7-371f-4663-b005-5fdcce36936b-kube-api-access-p2nlh\") pod \"heat-operator-controller-manager-594c8c9d5d-bl4z8\" (UID: \"9a5569f7-371f-4663-b005-5fdcce36936b\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.144728 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.145748 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210332 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210387 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210430 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210458 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.210493 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.210936 4720 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.210987 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:14.710971345 +0000 UTC m=+892.619711277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.225087 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-vqlzt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.231623 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.290623 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.295247 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8fx5\" (UniqueName: \"kubernetes.io/projected/085a2e93-1496-47f3-a7dc-4acae2e201fc-kube-api-access-m8fx5\") pod \"keystone-operator-controller-manager-767fdc4f47-54hwg\" (UID: \"085a2e93-1496-47f3-a7dc-4acae2e201fc\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.308051 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwk8l\" (UniqueName: \"kubernetes.io/projected/b80cffaf-5853-47ac-b783-c26da64425ff-kube-api-access-lwk8l\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.311016 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313260 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313316 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.313351 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.314102 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2dp\" (UniqueName: \"kubernetes.io/projected/9b467fa8-1984-4659-8873-99c20204b16b-kube-api-access-pn2dp\") pod \"ironic-operator-controller-manager-78757b4889-glbt4\" (UID: \"9b467fa8-1984-4659-8873-99c20204b16b\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.368228 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-8hmvp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.374468 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.375234 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.383313 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cv4\" (UniqueName: \"kubernetes.io/projected/589a442f-27a6-4d23-85dd-9e5b1556363f-kube-api-access-t4cv4\") pod \"mariadb-operator-controller-manager-c87fff755-v4fbm\" (UID: \"589a442f-27a6-4d23-85dd-9e5b1556363f\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.390623 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.391485 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.401283 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-btdw9" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.405083 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.415890 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416557 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416592 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.416637 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.454573 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.459608 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kjvw\" (UniqueName: \"kubernetes.io/projected/c38df2a4-6626-4b71-9dcd-7ef3003ee693-kube-api-access-5kjvw\") pod \"neutron-operator-controller-manager-cb4666565-d22bk\" (UID: \"c38df2a4-6626-4b71-9dcd-7ef3003ee693\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.475945 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.488330 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.502210 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gkdp\" (UniqueName: \"kubernetes.io/projected/370e5a87-5edf-4d48-9b65-335400a84cd2-kube-api-access-7gkdp\") pod \"manila-operator-controller-manager-864f6b75bf-n5bwd\" (UID: \"370e5a87-5edf-4d48-9b65-335400a84cd2\") " 
pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.510211 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.511119 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527251 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527109 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.527182 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7pxct" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.563784 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.565037 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.569644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ntr5l" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.586773 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.594277 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.596196 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628072 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628126 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628205 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.628257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.632778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.633551 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8th\" (UniqueName: \"kubernetes.io/projected/9695fd09-d135-426b-a129-66f945d2dd90-kube-api-access-9d8th\") pod \"octavia-operator-controller-manager-7fc9b76cf6-pw4z6\" (UID: \"9695fd09-d135-426b-a129-66f945d2dd90\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.651573 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.652444 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.672414 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.672859 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fmnjf" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.673410 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.680040 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-vc82s" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.687990 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99khf\" (UniqueName: \"kubernetes.io/projected/bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5-kube-api-access-99khf\") pod \"nova-operator-controller-manager-65849867d6-vzzmp\" (UID: \"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.722927 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730007 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730061 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.730101 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.733930 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.734204 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.734229 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.734141 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.734594 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.23457975 +0000 UTC m=+893.143319682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.736462 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: E0121 14:44:14.736502 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.736483522 +0000 UTC m=+893.645223454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.742194 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.755190 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773102 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773310 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.773372 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.774106 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.787308 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzcb\" (UniqueName: \"kubernetes.io/projected/88327b24-ce00-4bb4-98d1-24060c6dbf28-kube-api-access-mlzcb\") pod \"ovn-operator-controller-manager-55db956ddc-689zh\" (UID: \"88327b24-ce00-4bb4-98d1-24060c6dbf28\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.791929 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-lk8rz" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.795544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwxc\" (UniqueName: \"kubernetes.io/projected/88e81fdb-6501-410c-9452-d3ba7f41a30d-kube-api-access-7cwxc\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836010 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836064 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.836103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.862063 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-msw9s\" (UniqueName: \"kubernetes.io/projected/18ce7f0d-00de-4a92-97f2-743d9057abff-kube-api-access-msw9s\") pod \"placement-operator-controller-manager-686df47fcb-2clln\" (UID: \"18ce7f0d-00de-4a92-97f2-743d9057abff\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.864469 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.873041 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.873533 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxxf9\" (UniqueName: \"kubernetes.io/projected/a2557af5-c155-4d37-9b9a-f9335cac47b1-kube-api-access-nxxf9\") pod \"swift-operator-controller-manager-85dd56d4cc-4tjlt\" (UID: \"a2557af5-c155-4d37-9b9a-f9335cac47b1\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.877174 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vptjt" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.880454 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.881256 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.896441 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-swtlp" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.928102 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.928600 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938289 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.938433 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.952195 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:14 crc kubenswrapper[4720]: I0121 14:44:14.984520 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.002870 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmsgp\" (UniqueName: \"kubernetes.io/projected/cd17e86c-5586-4ea9-979d-2c195494fe99-kube-api-access-mmsgp\") pod \"test-operator-controller-manager-7cd8bc9dbb-xczlv\" (UID: \"cd17e86c-5586-4ea9-979d-2c195494fe99\") " pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.003029 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.010615 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.040321 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.040600 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.106082 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xht7q\" (UniqueName: \"kubernetes.io/projected/a050e31c-3d6d-490c-8f74-637c37c96a5e-kube-api-access-xht7q\") pod \"telemetry-operator-controller-manager-5f8f495fcf-8hrkh\" (UID: \"a050e31c-3d6d-490c-8f74-637c37c96a5e\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.120899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcwnf\" (UniqueName: \"kubernetes.io/projected/de2e9655-961c-4250-9852-332dfe335b4a-kube-api-access-hcwnf\") pod \"watcher-operator-controller-manager-64cd966744-jfkfq\" (UID: \"de2e9655-961c-4250-9852-332dfe335b4a\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.184615 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.186488 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.212380 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.228263 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.245995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.246159 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.246263 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rgchh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.248224 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.248409 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.248457 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:16.248443721 +0000 UTC m=+894.157183653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.257023 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.263382 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.292162 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.293084 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.295581 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.304379 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-cnwp6" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.304918 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.334914 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352585 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352675 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.352698 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.410417 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.455307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458693 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458743 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: 
\"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.458795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460201 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460261 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.960242437 +0000 UTC m=+893.868982369 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460391 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.460424 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:15.960415302 +0000 UTC m=+893.869155234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.463933 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.499040 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2pc5\" (UniqueName: \"kubernetes.io/projected/eb81b686-832a-414b-aa66-cf40a72a7427-kube-api-access-x2pc5\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: W0121 14:44:15.520140 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96218341_1cf7_4aa1_bb9a_7a7abba7a93e.slice/crio-8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32 WatchSource:0}: Error finding container 8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32: Status 404 returned error can't find the container with id 8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32 Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.560578 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.585358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2zp6\" (UniqueName: \"kubernetes.io/projected/8db4bced-5679-43ab-a5c9-ba87574aaa02-kube-api-access-d2zp6\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mm7cg\" (UID: \"8db4bced-5679-43ab-a5c9-ba87574aaa02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.621536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" event={"ID":"6c93648a-7076-4d91-ac7a-f389ab1159cc","Type":"ContainerStarted","Data":"4de74b9430eca3d9796788dbf314218cdbdfb3a296b55018ba2792fc21dfd78a"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.639708 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" event={"ID":"96218341-1cf7-4aa1-bb9a-7a7abba7a93e","Type":"ContainerStarted","Data":"8b2392c4209a5b5a84630e43fe459621efb9dc14702481718a946eb135f39d32"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.640650 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" event={"ID":"655f8c6a-4936-45d3-9538-66ee77a050d3","Type":"ContainerStarted","Data":"467e42d2bfb99b52210143ad72cbe50f704bba71478ce48827add1fd22fe519d"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.642638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" event={"ID":"b7ea6739-9c38-44a0-a382-8b26e37138fa","Type":"ContainerStarted","Data":"2709ab9bd89c9a2bd6e188957fe2bc34182484aa9343c08e775b56940d6423af"} Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.654085 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.763884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.764479 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.764585 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:17.764562233 +0000 UTC m=+895.673302165 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.924917 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.959276 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh"] Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.970224 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: I0121 14:44:15.970266 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970436 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970483 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. 
No retries permitted until 2026-01-21 14:44:16.970467699 +0000 UTC m=+894.879207631 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970846 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:15 crc kubenswrapper[4720]: E0121 14:44:15.970872 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:16.97086362 +0000 UTC m=+894.879603552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.002502 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.010343 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.076540 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh"] Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.110712 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88327b24_ce00_4bb4_98d1_24060c6dbf28.slice/crio-7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5 WatchSource:0}: Error finding container 7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5: Status 404 returned error can't find the container with id 7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5 Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.113930 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.121731 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.126262 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.171177 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.208545 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.218376 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.227743 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-2clln"] Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.235280 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxxf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-4tjlt_openstack-operators(a2557af5-c155-4d37-9b9a-f9335cac47b1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.235413 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq"] Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.236480 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.241876 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2e9655_961c_4250_9852_332dfe335b4a.slice/crio-bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6 WatchSource:0}: Error finding container bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6: Status 404 returned error can't find the container with id bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6 Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.243582 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcwnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-jfkfq_openstack-operators(de2e9655-961c-4250-9852-332dfe335b4a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.246753 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.277773 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.277974 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.278037 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.278018862 +0000 UTC m=+896.186758794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.340609 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv"] Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.344746 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh"] Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.349494 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda050e31c_3d6d_490c_8f74_637c37c96a5e.slice/crio-06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311 WatchSource:0}: Error finding container 06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311: Status 404 returned error can't find the container with id 06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311 Jan 21 14:44:16 crc kubenswrapper[4720]: W0121 14:44:16.359771 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd17e86c_5586_4ea9_979d_2c195494fe99.slice/crio-6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72 WatchSource:0}: Error finding container 6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72: Status 404 returned error can't find the container with id 6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72 Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.363392 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmsgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-xczlv_openstack-operators(cd17e86c-5586-4ea9-979d-2c195494fe99): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.365808 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.375627 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg"] Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.383403 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2zp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mm7cg_openstack-operators(8db4bced-5679-43ab-a5c9-ba87574aaa02): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.384719 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.663482 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" event={"ID":"9a5569f7-371f-4663-b005-5fdcce36936b","Type":"ContainerStarted","Data":"d85df34e81d1295c94ffdc4e8d4ca0bd384c874626cc4ba5f395a80812343576"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.666435 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" event={"ID":"de2e9655-961c-4250-9852-332dfe335b4a","Type":"ContainerStarted","Data":"bdc37c42c632a0248431aae7514456debbcda86ac5a41fa2bc84cfaa3e4014b6"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.668106 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" event={"ID":"c38df2a4-6626-4b71-9dcd-7ef3003ee693","Type":"ContainerStarted","Data":"ad986fcfad4b159d41f754a788241bae78656828028b3ea9e5f35412c2c6f264"} Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.676291 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.676694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" event={"ID":"8db4bced-5679-43ab-a5c9-ba87574aaa02","Type":"ContainerStarted","Data":"356ea00aac9afd839cc8b829df740b648a053aed9dbb2c957aeca9749ccd4ef3"} Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.678162 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.682920 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702320 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" event={"ID":"18ce7f0d-00de-4a92-97f2-743d9057abff","Type":"ContainerStarted","Data":"64698aed373aa846d9adffb375d485022a13648703c14b7e2fe148020540c9be"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702364 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" event={"ID":"a2557af5-c155-4d37-9b9a-f9335cac47b1","Type":"ContainerStarted","Data":"e23be9095254d2e90d03ef41e4d3a169308157a70bc64dfbc3d8a67df557b34f"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702376 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" event={"ID":"88327b24-ce00-4bb4-98d1-24060c6dbf28","Type":"ContainerStarted","Data":"7694e840a979c59166703c99cb417768ff92878a5362dd1a01731e7599a35fd5"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.702386 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" event={"ID":"071d4469-5b09-49a3-97f4-239d811825a2","Type":"ContainerStarted","Data":"193be7305a6eac486d49dcadcae7aa92048fd21dfcd08a156d4cd642174269ae"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.703942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" event={"ID":"9695fd09-d135-426b-a129-66f945d2dd90","Type":"ContainerStarted","Data":"1b008f060ae4d6cec2a9348eee3da41d87032c67738d60165b931e2544775587"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.709827 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" event={"ID":"a050e31c-3d6d-490c-8f74-637c37c96a5e","Type":"ContainerStarted","Data":"06d837f977636e63a5658d405a3f48a28bd8320d2a57303fecc15278cfd0e311"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.723722 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" event={"ID":"cd17e86c-5586-4ea9-979d-2c195494fe99","Type":"ContainerStarted","Data":"6598d017c2723c09d88d01d101834943cd3f4611b071025a816a08575d8fcf72"} Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.725570 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.728602 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" event={"ID":"085a2e93-1496-47f3-a7dc-4acae2e201fc","Type":"ContainerStarted","Data":"2d211466bdd1649f8c106eac2eb3ce8f05c80b13ca75b726ac9cbaecc9fd6f01"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.732300 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" event={"ID":"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5","Type":"ContainerStarted","Data":"43584b1caac15137c1334b925a24f0376a433f856b9c58bd0b4db09ea7599a13"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.741243 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" event={"ID":"370e5a87-5edf-4d48-9b65-335400a84cd2","Type":"ContainerStarted","Data":"00ce8de16f0d8b050f9fea187c22f6dc461f2628f617aacde9fa9fa92fbeb733"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.743398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" event={"ID":"589a442f-27a6-4d23-85dd-9e5b1556363f","Type":"ContainerStarted","Data":"ceb822f4a19e352e475c926fe67026a70f3f3d82505ab054b023f9126c6692c5"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.745022 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" event={"ID":"9b467fa8-1984-4659-8873-99c20204b16b","Type":"ContainerStarted","Data":"3be16a220f0ce2b9b4d7071b9308f8ebd3d77b11b12a54736d85d3714d83c6b9"} Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.988782 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:16 crc kubenswrapper[4720]: I0121 14:44:16.989069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.988944 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989204 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.989190284 +0000 UTC m=+896.897930216 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989158 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:16 crc kubenswrapper[4720]: E0121 14:44:16.989546 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:18.989535184 +0000 UTC m=+896.898275106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.756924 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757248 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757299 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.757365 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:44:17 crc kubenswrapper[4720]: I0121 14:44:17.809376 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.811612 4720 secret.go:188] Couldn't get 
secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:17 crc kubenswrapper[4720]: E0121 14:44:17.823820 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:21.823771077 +0000 UTC m=+899.732511009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:18 crc kubenswrapper[4720]: I0121 14:44:18.340109 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:18 crc kubenswrapper[4720]: E0121 14:44:18.340274 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:18 crc kubenswrapper[4720]: E0121 14:44:18.340333 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. No retries permitted until 2026-01-21 14:44:22.34031613 +0000 UTC m=+900.249056062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:19 crc kubenswrapper[4720]: I0121 14:44:19.056352 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:19 crc kubenswrapper[4720]: I0121 14:44:19.056397 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.057823 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.057898 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. 
No retries permitted until 2026-01-21 14:44:23.057881167 +0000 UTC m=+900.966621099 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.058641 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:19 crc kubenswrapper[4720]: E0121 14:44:19.058837 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:23.058813862 +0000 UTC m=+900.967553794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:21 crc kubenswrapper[4720]: I0121 14:44:21.911598 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:21 crc kubenswrapper[4720]: E0121 14:44:21.912000 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:21 crc kubenswrapper[4720]: E0121 14:44:21.912053 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:29.912035824 +0000 UTC m=+907.820775756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:22 crc kubenswrapper[4720]: I0121 14:44:22.418840 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:22 crc kubenswrapper[4720]: E0121 14:44:22.418981 4720 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:22 crc kubenswrapper[4720]: E0121 14:44:22.419028 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert podName:88e81fdb-6501-410c-9452-d3ba7f41a30d nodeName:}" failed. 
No retries permitted until 2026-01-21 14:44:30.419014067 +0000 UTC m=+908.327753999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" (UID: "88e81fdb-6501-410c-9452-d3ba7f41a30d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 21 14:44:23 crc kubenswrapper[4720]: I0121 14:44:23.136953 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:23 crc kubenswrapper[4720]: I0121 14:44:23.138198 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138166 4720 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138514 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:31.138499875 +0000 UTC m=+909.047239807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "metrics-server-cert" not found Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.138313 4720 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 21 14:44:23 crc kubenswrapper[4720]: E0121 14:44:23.139001 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs podName:eb81b686-832a-414b-aa66-cf40a72a7427 nodeName:}" failed. No retries permitted until 2026-01-21 14:44:31.138991968 +0000 UTC m=+909.047731900 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs") pod "openstack-operator-controller-manager-d47656bc9-4hjmr" (UID: "eb81b686-832a-414b-aa66-cf40a72a7427") : secret "webhook-server-cert" not found Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.411154 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488" Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.411716 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fl7q2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-9b68f5989-wnzfm_openstack-operators(b7ea6739-9c38-44a0-a382-8b26e37138fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.412912 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" 
podUID="b7ea6739-9c38-44a0-a382-8b26e37138fa" Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.880758 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" podUID="b7ea6739-9c38-44a0-a382-8b26e37138fa" Jan 21 14:44:29 crc kubenswrapper[4720]: I0121 14:44:29.953533 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.955561 4720 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:29 crc kubenswrapper[4720]: E0121 14:44:29.955695 4720 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert podName:b80cffaf-5853-47ac-b783-c26da64425ff nodeName:}" failed. No retries permitted until 2026-01-21 14:44:45.955630499 +0000 UTC m=+923.864370431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert") pod "infra-operator-controller-manager-77c48c7859-xtpbn" (UID: "b80cffaf-5853-47ac-b783-c26da64425ff") : secret "infra-operator-webhook-server-cert" not found Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.461269 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.470789 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88e81fdb-6501-410c-9452-d3ba7f41a30d-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw\" (UID: \"88e81fdb-6501-410c-9452-d3ba7f41a30d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.752626 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-7pxct" Jan 21 14:44:30 crc kubenswrapper[4720]: I0121 14:44:30.761917 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.172602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.172677 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.178855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-metrics-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.184866 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.185035 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t4cv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-v4fbm_openstack-operators(589a442f-27a6-4d23-85dd-9e5b1556363f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.186170 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podUID="589a442f-27a6-4d23-85dd-9e5b1556363f" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.186963 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/eb81b686-832a-414b-aa66-cf40a72a7427-webhook-certs\") pod \"openstack-operator-controller-manager-d47656bc9-4hjmr\" (UID: \"eb81b686-832a-414b-aa66-cf40a72a7427\") " pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.466206 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-rgchh" Jan 21 14:44:31 crc kubenswrapper[4720]: I0121 14:44:31.475409 4720 util.go:30] "No sandbox for pod can be found. 
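`rpc error: code = Canceled desc = copying config: context canceled` is the container runtime reporting that the pull's context was canceled before the image config finished copying, commonly because the pull exceeded kubelet's runtime request timeout or was queued behind other pulls on a loaded single-node host; kubelet then retries under ImagePullBackOff, which is the alternating ErrImagePull/ImagePullBackOff pattern filling the rest of this capture. To gauge how many images are affected, an illustrative parser over a saved copy of this journal (the file name is hypothetical):

```python
# Illustrative sketch: count "PullImage from image service failed" events
# per image in a saved copy of this journal.
import re
from collections import Counter

PULL_ERR = re.compile(r'"PullImage from image service failed".*?image="([^"]+)"')

counts = Counter()
with open("kubelet-journal.log") as fh:   # hypothetical file name
    for line in fh:
        if (m := PULL_ERR.search(line)):
            counts[m.group(1)] += 1

for image, n in counts.most_common():
    print(f"{n:3d}  {image}")
```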
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.829547 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.829759 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-msw9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-2clln_openstack-operators(18ce7f0d-00de-4a92-97f2-743d9057abff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.830905 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podUID="18ce7f0d-00de-4a92-97f2-743d9057abff" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.887773 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podUID="18ce7f0d-00de-4a92-97f2-743d9057abff" Jan 21 14:44:31 crc kubenswrapper[4720]: E0121 14:44:31.887859 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podUID="589a442f-27a6-4d23-85dd-9e5b1556363f" Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.142567 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.143192 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-99khf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-65849867d6-vzzmp_openstack-operators(bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.144728 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podUID="bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5" Jan 21 14:44:35 crc kubenswrapper[4720]: E0121 14:44:35.919497 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podUID="bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5" Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.499241 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492" Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.499454 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p2nlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-594c8c9d5d-bl4z8_openstack-operators(9a5569f7-371f-4663-b005-5fdcce36936b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.500752 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podUID="9a5569f7-371f-4663-b005-5fdcce36936b" Jan 21 14:44:37 crc kubenswrapper[4720]: E0121 14:44:37.932452 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:2f9a2f064448faebbae58f52d564dc0e8e39bed0fc12bd6b9fe925e42f1b5492\\\"\"" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podUID="9a5569f7-371f-4663-b005-5fdcce36936b" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.228973 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.229201 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xht7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-8hrkh_openstack-operators(a050e31c-3d6d-490c-8f74-637c37c96a5e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.230576 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podUID="a050e31c-3d6d-490c-8f74-637c37c96a5e" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.914237 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.914562 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twq9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7ddb5c749-q2t2m_openstack-operators(655f8c6a-4936-45d3-9538-66ee77a050d3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.916701 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podUID="655f8c6a-4936-45d3-9538-66ee77a050d3" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.936387 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podUID="a050e31c-3d6d-490c-8f74-637c37c96a5e" Jan 21 14:44:38 crc kubenswrapper[4720]: E0121 14:44:38.936929 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podUID="655f8c6a-4936-45d3-9538-66ee77a050d3" Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.515931 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32" Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.516135 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7gkdp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-864f6b75bf-n5bwd_openstack-operators(370e5a87-5edf-4d48-9b65-335400a84cd2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.517540 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podUID="370e5a87-5edf-4d48-9b65-335400a84cd2" Jan 21 14:44:39 crc kubenswrapper[4720]: E0121 14:44:39.941368 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:fd2e631e747c35a95f083418f5829d06c4b830f1fdb322368ff6190b9887ea32\\\"\"" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podUID="370e5a87-5edf-4d48-9b65-335400a84cd2" Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.149297 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf" Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.150092 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mlzcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-55db956ddc-689zh_openstack-operators(88327b24-ce00-4bb4-98d1-24060c6dbf28): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.151386 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podUID="88327b24-ce00-4bb4-98d1-24060c6dbf28" Jan 21 14:44:40 crc kubenswrapper[4720]: E0121 14:44:40.946382 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:8b3bfb9e86618b7ac69443939b0968fae28a22cd62ea1e429b599ff9f8a5f8cf\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podUID="88327b24-ce00-4bb4-98d1-24060c6dbf28" Jan 21 14:44:43 crc kubenswrapper[4720]: I0121 14:44:43.682778 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.270820 4720 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.271082 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pn2dp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-glbt4_openstack-operators(9b467fa8-1984-4659-8873-99c20204b16b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.272308 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podUID="9b467fa8-1984-4659-8873-99c20204b16b" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.851021 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028" Jan 21 14:44:45 
crc kubenswrapper[4720]: E0121 14:44:45.851649 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ww4p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-c6994669c-gwlgm_openstack-operators(6c93648a-7076-4d91-ac7a-f389ab1159cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.854189 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podUID="6c93648a-7076-4d91-ac7a-f389ab1159cc" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.978097 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:d69a68cdac59165797daf1064f3a3b4b14b546bf1c7254070a7ed1238998c028\\\"\"" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podUID="6c93648a-7076-4d91-ac7a-f389ab1159cc" Jan 21 14:44:45 crc kubenswrapper[4720]: E0121 14:44:45.978217 4720 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podUID="9b467fa8-1984-4659-8873-99c20204b16b" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.008196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.021404 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80cffaf-5853-47ac-b783-c26da64425ff-cert\") pod \"infra-operator-controller-manager-77c48c7859-xtpbn\" (UID: \"b80cffaf-5853-47ac-b783-c26da64425ff\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.047448 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zcxwg" Jan 21 14:44:46 crc kubenswrapper[4720]: I0121 14:44:46.057107 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.454255 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.454434 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9d8th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7fc9b76cf6-pw4z6_openstack-operators(9695fd09-d135-426b-a129-66f945d2dd90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.455630 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podUID="9695fd09-d135-426b-a129-66f945d2dd90" Jan 21 14:44:46 crc kubenswrapper[4720]: E0121 14:44:46.991292 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ab629ec4ce57b5cde9cd6d75069e68edca441b97b7b5a3f58804e2e61766b729\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podUID="9695fd09-d135-426b-a129-66f945d2dd90" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.591242 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.591531 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5kjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-cb4666565-d22bk_openstack-operators(c38df2a4-6626-4b71-9dcd-7ef3003ee693): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.592705 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podUID="c38df2a4-6626-4b71-9dcd-7ef3003ee693" Jan 21 14:44:48 crc kubenswrapper[4720]: E0121 14:44:48.993867 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0f440bf7dc937ce0135bdd328716686fd2f1320f453a9ac4e11e96383148ad6c\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podUID="c38df2a4-6626-4b71-9dcd-7ef3003ee693" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.395943 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.396116 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nxxf9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-4tjlt_openstack-operators(a2557af5-c155-4d37-9b9a-f9335cac47b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.397338 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.977332 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.977722 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcwnf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-jfkfq_openstack-operators(de2e9655-961c-4250-9852-332dfe335b4a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:49 crc kubenswrapper[4720]: E0121 14:44:49.981511 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.199554 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.199717 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mmsgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7cd8bc9dbb-xczlv_openstack-operators(cd17e86c-5586-4ea9-979d-2c195494fe99): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.201378 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.729850 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.730153 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m8fx5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-54hwg_openstack-operators(085a2e93-1496-47f3-a7dc-4acae2e201fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:54 crc kubenswrapper[4720]: E0121 14:44:54.731303 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podUID="085a2e93-1496-47f3-a7dc-4acae2e201fc" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.037392 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podUID="085a2e93-1496-47f3-a7dc-4acae2e201fc" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.104183 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.104344 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d2zp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mm7cg_openstack-operators(8db4bced-5679-43ab-a5c9-ba87574aaa02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:44:55 crc kubenswrapper[4720]: E0121 14:44:55.105562 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.769824 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr"] Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.906681 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw"] Jan 21 14:44:55 crc kubenswrapper[4720]: I0121 14:44:55.994901 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn"] Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.043072 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" event={"ID":"b7ea6739-9c38-44a0-a382-8b26e37138fa","Type":"ContainerStarted","Data":"25dc0e69a9ca5bfa199a30b67aa5e77dece5373b5796de32a87b545138d95826"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.043290 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.044954 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" event={"ID":"bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5","Type":"ContainerStarted","Data":"02714d73a06c1ab2e0e827cee30936b4ad1bf1a5b5f5f7c174e19e6bb1418af1"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.045134 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.048067 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" event={"ID":"18ce7f0d-00de-4a92-97f2-743d9057abff","Type":"ContainerStarted","Data":"039b7dd4a95f9875eb3baf220815896626a7694bb2e2236faadabdbf7a8345f0"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.048514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.052504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" event={"ID":"eb81b686-832a-414b-aa66-cf40a72a7427","Type":"ContainerStarted","Data":"edb99a7ba20c3db307a55145637fcd23c1ea76a16d7e92449ca0d659fedbf714"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.053741 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" event={"ID":"370e5a87-5edf-4d48-9b65-335400a84cd2","Type":"ContainerStarted","Data":"115ea91d56873cb47c112be05820d5378f2f171bcf58765f5ce91b4a2f851cd9"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.053942 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.055199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" event={"ID":"b80cffaf-5853-47ac-b783-c26da64425ff","Type":"ContainerStarted","Data":"317ba5688fe50672c7df2a62fda41a00f31d58d621aa4f3411f6bf91cf1548c5"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.057484 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" event={"ID":"88e81fdb-6501-410c-9452-d3ba7f41a30d","Type":"ContainerStarted","Data":"b7814ceac5f9c7ed17ca9501dbd04b92f686ec24c1a5d0d24cdbf4eb92168380"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.069948 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" podStartSLOduration=2.982712792 podStartE2EDuration="43.069928208s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.123853839 +0000 UTC m=+893.032593761" lastFinishedPulling="2026-01-21 14:44:55.211069225 +0000 UTC m=+933.119809177" observedRunningTime="2026-01-21 14:44:56.061858608 +0000 UTC m=+933.970598540" watchObservedRunningTime="2026-01-21 14:44:56.069928208 +0000 UTC m=+933.978668140" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.071374 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" event={"ID":"655f8c6a-4936-45d3-9538-66ee77a050d3","Type":"ContainerStarted","Data":"dd8a820b6f770e56cc5f4fb8508a7ff4c0a5b576c11c4c670f11ff60076c068e"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.072072 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.078584 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" event={"ID":"071d4469-5b09-49a3-97f4-239d811825a2","Type":"ContainerStarted","Data":"bff2b48d1274a26c4af5723d29cc757b4ea18436a4199cc6eba12a248bc8a98a"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.079200 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.090143 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" event={"ID":"9a5569f7-371f-4663-b005-5fdcce36936b","Type":"ContainerStarted","Data":"c1139b85e5ee1a71aef08278791cf53cf15521556580d8333fc6e5443dc4344b"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.090319 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.091588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" event={"ID":"589a442f-27a6-4d23-85dd-9e5b1556363f","Type":"ContainerStarted","Data":"9fe05ff653c035141be9155e808b3735cd2dc809ce42e617e65f99a4b72de9a4"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.092244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.093642 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" event={"ID":"96218341-1cf7-4aa1-bb9a-7a7abba7a93e","Type":"ContainerStarted","Data":"88e33885539513554b4b6e245a5a208386045d6f526ab5a4f2b699eff92ffbd3"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.094012 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.095018 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" event={"ID":"a050e31c-3d6d-490c-8f74-637c37c96a5e","Type":"ContainerStarted","Data":"a556a5f3bb89df39fb8dd1f31bbf4c28e1d6092323d424ab58c60dceb82bb1fa"} Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.095360 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.112831 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" podStartSLOduration=3.128621411 
podStartE2EDuration="42.112809296s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.225438451 +0000 UTC m=+894.134178383" lastFinishedPulling="2026-01-21 14:44:55.209626326 +0000 UTC m=+933.118366268" observedRunningTime="2026-01-21 14:44:56.112529318 +0000 UTC m=+934.021269250" watchObservedRunningTime="2026-01-21 14:44:56.112809296 +0000 UTC m=+934.021549228" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.114556 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" podStartSLOduration=3.070700744 podStartE2EDuration="42.114546693s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.225018429 +0000 UTC m=+894.133758361" lastFinishedPulling="2026-01-21 14:44:55.268864378 +0000 UTC m=+933.177604310" observedRunningTime="2026-01-21 14:44:56.08544808 +0000 UTC m=+933.994188012" watchObservedRunningTime="2026-01-21 14:44:56.114546693 +0000 UTC m=+934.023286655" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.136045 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" podStartSLOduration=3.026714726 podStartE2EDuration="42.136028767s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.181867855 +0000 UTC m=+894.090607787" lastFinishedPulling="2026-01-21 14:44:55.291181896 +0000 UTC m=+933.199921828" observedRunningTime="2026-01-21 14:44:56.135237316 +0000 UTC m=+934.043977248" watchObservedRunningTime="2026-01-21 14:44:56.136028767 +0000 UTC m=+934.044768699" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.159513 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" podStartSLOduration=2.880504626 podStartE2EDuration="42.159491936s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.932874726 +0000 UTC m=+893.841614658" lastFinishedPulling="2026-01-21 14:44:55.211862036 +0000 UTC m=+933.120601968" observedRunningTime="2026-01-21 14:44:56.154816139 +0000 UTC m=+934.063556081" watchObservedRunningTime="2026-01-21 14:44:56.159491936 +0000 UTC m=+934.068231868" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.242505 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" podStartSLOduration=6.613191276 podStartE2EDuration="43.242483146s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.989812595 +0000 UTC m=+893.898552527" lastFinishedPulling="2026-01-21 14:44:52.619104425 +0000 UTC m=+930.527844397" observedRunningTime="2026-01-21 14:44:56.239817653 +0000 UTC m=+934.148557585" watchObservedRunningTime="2026-01-21 14:44:56.242483146 +0000 UTC m=+934.151223078" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.304369 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" podStartSLOduration=4.070653413 podStartE2EDuration="43.304351881s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.060394287 +0000 UTC m=+893.969134219" lastFinishedPulling="2026-01-21 14:44:55.294092755 +0000 UTC m=+933.202832687" observedRunningTime="2026-01-21 
14:44:56.299696383 +0000 UTC m=+934.208436315" watchObservedRunningTime="2026-01-21 14:44:56.304351881 +0000 UTC m=+934.213091813" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.305200 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" podStartSLOduration=3.389212635 podStartE2EDuration="42.305194183s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.351499293 +0000 UTC m=+894.260239225" lastFinishedPulling="2026-01-21 14:44:55.267480841 +0000 UTC m=+933.176220773" observedRunningTime="2026-01-21 14:44:56.274852727 +0000 UTC m=+934.183592659" watchObservedRunningTime="2026-01-21 14:44:56.305194183 +0000 UTC m=+934.213934105" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.358343 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" podStartSLOduration=3.595905486 podStartE2EDuration="43.35832351s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.51581347 +0000 UTC m=+893.424553402" lastFinishedPulling="2026-01-21 14:44:55.278231494 +0000 UTC m=+933.186971426" observedRunningTime="2026-01-21 14:44:56.32896214 +0000 UTC m=+934.237702072" watchObservedRunningTime="2026-01-21 14:44:56.35832351 +0000 UTC m=+934.267063432" Jan 21 14:44:56 crc kubenswrapper[4720]: I0121 14:44:56.360521 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" podStartSLOduration=6.27907441 podStartE2EDuration="43.36051636s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.5374658 +0000 UTC m=+893.446205732" lastFinishedPulling="2026-01-21 14:44:52.61890772 +0000 UTC m=+930.527647682" observedRunningTime="2026-01-21 14:44:56.353630172 +0000 UTC m=+934.262370104" watchObservedRunningTime="2026-01-21 14:44:56.36051636 +0000 UTC m=+934.269256292" Jan 21 14:44:57 crc kubenswrapper[4720]: I0121 14:44:57.106705 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" event={"ID":"eb81b686-832a-414b-aa66-cf40a72a7427","Type":"ContainerStarted","Data":"19ff05a60a49d9a47c6a2ed4ccfca59971a386247e464f0f2626c2650e73f180"} Jan 21 14:44:57 crc kubenswrapper[4720]: I0121 14:44:57.196076 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" podStartSLOduration=43.196056138 podStartE2EDuration="43.196056138s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:44:57.179813016 +0000 UTC m=+935.088552958" watchObservedRunningTime="2026-01-21 14:44:57.196056138 +0000 UTC m=+935.104796070" Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.117193 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" event={"ID":"88327b24-ce00-4bb4-98d1-24060c6dbf28","Type":"ContainerStarted","Data":"7fd2f9a820b98ec31d5d549fa09baa4a93071ad8a796273e83096f99e20d28d1"} Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.117535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:44:58 crc kubenswrapper[4720]: I0121 14:44:58.138311 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" podStartSLOduration=3.021170835 podStartE2EDuration="44.138295071s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.14166672 +0000 UTC m=+894.050406652" lastFinishedPulling="2026-01-21 14:44:57.258790956 +0000 UTC m=+935.167530888" observedRunningTime="2026-01-21 14:44:58.133819449 +0000 UTC m=+936.042559381" watchObservedRunningTime="2026-01-21 14:44:58.138295071 +0000 UTC m=+936.047035003" Jan 21 14:44:59 crc kubenswrapper[4720]: E0121 14:44:59.681565 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podUID="a2557af5-c155-4d37-9b9a-f9335cac47b1" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.175113 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.177834 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.182101 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.182175 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.193017 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.320970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.321004 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.321041 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc 
kubenswrapper[4720]: I0121 14:45:00.422012 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422055 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422095 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.422976 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.430060 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.440763 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"collect-profiles-29483445-s84lg\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: I0121 14:45:00.505111 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:00 crc kubenswrapper[4720]: E0121 14:45:00.730350 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podUID="de2e9655-961c-4250-9852-332dfe335b4a" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.135598 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" event={"ID":"9b467fa8-1984-4659-8873-99c20204b16b","Type":"ContainerStarted","Data":"1bfcc1dd03888aef23c8e6b3d57950f247f7c32e68f8977a1c9da1be4fc97a20"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.136093 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.138813 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" event={"ID":"c38df2a4-6626-4b71-9dcd-7ef3003ee693","Type":"ContainerStarted","Data":"a77758167e47a97957508571a71924682fe25a9b67a056bc8fac33da30ae5c19"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.139341 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.141000 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" event={"ID":"6c93648a-7076-4d91-ac7a-f389ab1159cc","Type":"ContainerStarted","Data":"fbc6fca7434cef1e381fd6d87885e6a980bd13602aa13f1164869ad731a0e6d4"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.141161 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.144378 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" event={"ID":"b80cffaf-5853-47ac-b783-c26da64425ff","Type":"ContainerStarted","Data":"5e5cb687a7f92716f2cb25340a10dc99596dca88bc80c02499ffd1f7c4b6cfea"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.144543 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.146333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" event={"ID":"9695fd09-d135-426b-a129-66f945d2dd90","Type":"ContainerStarted","Data":"759f94efc0cae9fd026c4ad9faef3e14ae101ea44c8e128b12e4ab140e7f5779"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.146863 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.148170 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" event={"ID":"88e81fdb-6501-410c-9452-d3ba7f41a30d","Type":"ContainerStarted","Data":"4221bf11b4cb7619f41f6fcea41e7817f6ec752ba991e56eab430bbf275c95e3"} Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.148604 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.158890 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" podStartSLOduration=3.45389838 podStartE2EDuration="48.15886659s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.099507772 +0000 UTC m=+894.008247704" lastFinishedPulling="2026-01-21 14:45:00.804475982 +0000 UTC m=+938.713215914" observedRunningTime="2026-01-21 14:45:01.151912171 +0000 UTC m=+939.060652123" watchObservedRunningTime="2026-01-21 14:45:01.15886659 +0000 UTC m=+939.067606542" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.179904 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" podStartSLOduration=2.47477383 podStartE2EDuration="47.179883053s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.09940824 +0000 UTC m=+894.008148172" lastFinishedPulling="2026-01-21 14:45:00.804517463 +0000 UTC m=+938.713257395" observedRunningTime="2026-01-21 14:45:01.172067479 +0000 UTC m=+939.080807421" watchObservedRunningTime="2026-01-21 14:45:01.179883053 +0000 UTC m=+939.088622995" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.220220 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" podStartSLOduration=2.534013172 podStartE2EDuration="47.22020451s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.113441742 +0000 UTC m=+894.022181684" lastFinishedPulling="2026-01-21 14:45:00.79963309 +0000 UTC m=+938.708373022" observedRunningTime="2026-01-21 14:45:01.202592971 +0000 UTC m=+939.111332923" watchObservedRunningTime="2026-01-21 14:45:01.22020451 +0000 UTC m=+939.128944442" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.221424 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg"] Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.222184 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" podStartSLOduration=2.714369686 podStartE2EDuration="48.222178034s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:15.274272754 +0000 UTC m=+893.183012686" lastFinishedPulling="2026-01-21 14:45:00.782081102 +0000 UTC m=+938.690821034" observedRunningTime="2026-01-21 14:45:01.219217713 +0000 UTC m=+939.127957655" watchObservedRunningTime="2026-01-21 14:45:01.222178034 +0000 UTC m=+939.130917966" Jan 21 14:45:01 crc kubenswrapper[4720]: W0121 14:45:01.235568 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56142c27_a9f6_4617_ad00_0d1cd7416732.slice/crio-dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e WatchSource:0}: Error finding container dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e: Status 404 returned error can't find the container with id dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.254215 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" podStartSLOduration=43.471312606 podStartE2EDuration="48.254166125s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:56.016488853 +0000 UTC m=+933.925228785" lastFinishedPulling="2026-01-21 14:45:00.799342372 +0000 UTC m=+938.708082304" observedRunningTime="2026-01-21 14:45:01.251117602 +0000 UTC m=+939.159857554" watchObservedRunningTime="2026-01-21 14:45:01.254166125 +0000 UTC m=+939.162906067" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.324635 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" podStartSLOduration=42.507523081 podStartE2EDuration="47.324614563s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:55.987654888 +0000 UTC m=+933.896409880" lastFinishedPulling="2026-01-21 14:45:00.80476143 +0000 UTC m=+938.713501362" observedRunningTime="2026-01-21 14:45:01.308885635 +0000 UTC m=+939.217625577" watchObservedRunningTime="2026-01-21 14:45:01.324614563 +0000 UTC m=+939.233354495" Jan 21 14:45:01 crc kubenswrapper[4720]: I0121 14:45:01.481703 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-d47656bc9-4hjmr" Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.157326 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerStarted","Data":"a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1"} Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.157605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerStarted","Data":"dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e"} Jan 21 14:45:02 crc kubenswrapper[4720]: I0121 14:45:02.179168 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" podStartSLOduration=2.179150348 podStartE2EDuration="2.179150348s" podCreationTimestamp="2026-01-21 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:45:02.174211294 +0000 UTC m=+940.082951236" watchObservedRunningTime="2026-01-21 14:45:02.179150348 +0000 UTC m=+940.087890280" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.164354 4720 generic.go:334] "Generic (PLEG): container finished" podID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerID="a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1" exitCode=0 Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 
14:45:03.164414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerDied","Data":"a185df7caa4972969a59754656c2e094afde3b9115ea7c3b115437cf0a6c85a1"} Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.866968 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-wnzfm" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.932155 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-bjn2r" Jan 21 14:45:03 crc kubenswrapper[4720]: I0121 14:45:03.934083 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-q2t2m" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.066721 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-vfxfh" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.377639 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-bl4z8" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.408347 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-v4fbm" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.464197 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.581953 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582338 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") pod \"56142c27-a9f6-4617-ad00-0d1cd7416732\" (UID: \"56142c27-a9f6-4617-ad00-0d1cd7416732\") " Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.582821 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume" (OuterVolumeSpecName: "config-volume") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.587053 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.603033 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr" (OuterVolumeSpecName: "kube-api-access-cwvxr") pod "56142c27-a9f6-4617-ad00-0d1cd7416732" (UID: "56142c27-a9f6-4617-ad00-0d1cd7416732"). InnerVolumeSpecName "kube-api-access-cwvxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684416 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/56142c27-a9f6-4617-ad00-0d1cd7416732-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684454 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56142c27-a9f6-4617-ad00-0d1cd7416732-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.684466 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwvxr\" (UniqueName: \"kubernetes.io/projected/56142c27-a9f6-4617-ad00-0d1cd7416732-kube-api-access-cwvxr\") on node \"crc\" DevicePath \"\"" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.746233 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-vzzmp" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.760419 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-n5bwd" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.928875 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:45:04 crc kubenswrapper[4720]: I0121 14:45:04.932031 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-689zh" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.013474 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-2clln" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178456 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" event={"ID":"56142c27-a9f6-4617-ad00-0d1cd7416732","Type":"ContainerDied","Data":"dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e"} Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178527 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbbdc6cd4df54810bc28de491c3502b247e8ff570875b1b67219043c80a4af0e" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.178475 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483445-s84lg" Jan 21 14:45:05 crc kubenswrapper[4720]: I0121 14:45:05.299126 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-8hrkh" Jan 21 14:45:05 crc kubenswrapper[4720]: E0121 14:45:05.679345 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podUID="8db4bced-5679-43ab-a5c9-ba87574aaa02" Jan 21 14:45:06 crc kubenswrapper[4720]: I0121 14:45:06.065516 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-xtpbn" Jan 21 14:45:09 crc kubenswrapper[4720]: E0121 14:45:09.679872 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:244a4906353b84899db16a89e1ebb64491c9f85e69327cb2a72b6da0142a6e5e\\\"\"" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podUID="cd17e86c-5586-4ea9-979d-2c195494fe99" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.213236 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" event={"ID":"085a2e93-1496-47f3-a7dc-4acae2e201fc","Type":"ContainerStarted","Data":"6e3ab235b1a036848d728d8093859634b696ed3cd06458985b0bd31ee1226158"} Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.213972 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.240620 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" podStartSLOduration=3.665598445 podStartE2EDuration="57.240597509s" podCreationTimestamp="2026-01-21 14:44:13 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.026386602 +0000 UTC m=+893.935126534" lastFinishedPulling="2026-01-21 14:45:09.601385666 +0000 UTC m=+947.510125598" observedRunningTime="2026-01-21 14:45:10.234608566 +0000 UTC m=+948.143348518" watchObservedRunningTime="2026-01-21 14:45:10.240597509 +0000 UTC m=+948.149337461" Jan 21 14:45:10 crc kubenswrapper[4720]: I0121 14:45:10.771529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw" Jan 21 14:45:13 crc kubenswrapper[4720]: I0121 14:45:13.961174 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-gwlgm" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.240735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" event={"ID":"de2e9655-961c-4250-9852-332dfe335b4a","Type":"ContainerStarted","Data":"50e4ac158e03d1576d0b0021ba97e23ccd5940edf9ab87017ac66602fdea9014"} Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.241718 
4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.378420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-54hwg" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.401193 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" podStartSLOduration=3.510243511 podStartE2EDuration="1m0.401161515s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.24340628 +0000 UTC m=+894.152146212" lastFinishedPulling="2026-01-21 14:45:13.134324284 +0000 UTC m=+951.043064216" observedRunningTime="2026-01-21 14:45:14.260982188 +0000 UTC m=+952.169722150" watchObservedRunningTime="2026-01-21 14:45:14.401161515 +0000 UTC m=+952.309901467" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.589529 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-d22bk" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.599229 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-glbt4" Jan 21 14:45:14 crc kubenswrapper[4720]: I0121 14:45:14.726078 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-pw4z6" Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.256614 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" event={"ID":"a2557af5-c155-4d37-9b9a-f9335cac47b1","Type":"ContainerStarted","Data":"a98414e7ded5d377094a8e8b855ce704262508c6acaa7327778376dc0c31fd50"} Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.257910 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:45:16 crc kubenswrapper[4720]: I0121 14:45:16.310133 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" podStartSLOduration=2.527852374 podStartE2EDuration="1m2.310111138s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.235134095 +0000 UTC m=+894.143874027" lastFinishedPulling="2026-01-21 14:45:16.017392859 +0000 UTC m=+953.926132791" observedRunningTime="2026-01-21 14:45:16.305374829 +0000 UTC m=+954.214114761" watchObservedRunningTime="2026-01-21 14:45:16.310111138 +0000 UTC m=+954.218851080" Jan 21 14:45:19 crc kubenswrapper[4720]: I0121 14:45:19.278796 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" event={"ID":"8db4bced-5679-43ab-a5c9-ba87574aaa02","Type":"ContainerStarted","Data":"28813b3446ff07ba2908d4dde7287b08c5bec53016fc7df8edc5b7f7b4820c22"} Jan 21 14:45:19 crc kubenswrapper[4720]: I0121 14:45:19.295892 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mm7cg" podStartSLOduration=2.335237476 podStartE2EDuration="1m4.295871358s" podCreationTimestamp="2026-01-21 14:44:15 
+0000 UTC" firstStartedPulling="2026-01-21 14:44:16.383304958 +0000 UTC m=+894.292044890" lastFinishedPulling="2026-01-21 14:45:18.34393883 +0000 UTC m=+956.252678772" observedRunningTime="2026-01-21 14:45:19.294166032 +0000 UTC m=+957.202906014" watchObservedRunningTime="2026-01-21 14:45:19.295871358 +0000 UTC m=+957.204611290" Jan 21 14:45:22 crc kubenswrapper[4720]: I0121 14:45:22.310083 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" event={"ID":"cd17e86c-5586-4ea9-979d-2c195494fe99","Type":"ContainerStarted","Data":"c3e105273f6dfcd33639148919e2dc67f7d144de9f53263b410bebaf143572db"} Jan 21 14:45:23 crc kubenswrapper[4720]: I0121 14:45:23.318207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:45:23 crc kubenswrapper[4720]: I0121 14:45:23.339211 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" podStartSLOduration=3.576139275 podStartE2EDuration="1m9.339192002s" podCreationTimestamp="2026-01-21 14:44:14 +0000 UTC" firstStartedPulling="2026-01-21 14:44:16.363291774 +0000 UTC m=+894.272031706" lastFinishedPulling="2026-01-21 14:45:22.126344501 +0000 UTC m=+960.035084433" observedRunningTime="2026-01-21 14:45:23.333884937 +0000 UTC m=+961.242624879" watchObservedRunningTime="2026-01-21 14:45:23.339192002 +0000 UTC m=+961.247931944" Jan 21 14:45:24 crc kubenswrapper[4720]: I0121 14:45:24.989386 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-4tjlt" Jan 21 14:45:25 crc kubenswrapper[4720]: I0121 14:45:25.263615 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-jfkfq" Jan 21 14:45:35 crc kubenswrapper[4720]: I0121 14:45:35.220760 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7cd8bc9dbb-xczlv" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.708795 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:52 crc kubenswrapper[4720]: E0121 14:45:52.709521 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.709533 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.709682 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="56142c27-a9f6-4617-ad00-0d1cd7416732" containerName="collect-profiles" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.710496 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.719579 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7pv8k" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.719800 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.721027 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.721143 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.734729 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.805116 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.806694 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807765 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807873 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807896 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.807930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.808866 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.821618 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.880399 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.880451 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908823 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908937 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.908975 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.909012 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.909995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.910039 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.910148 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.931851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"dnsmasq-dns-675f4bcbfc-lvnwp\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:52 crc kubenswrapper[4720]: I0121 14:45:52.932157 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"dnsmasq-dns-78dd6ddcc-56stq\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.036621 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.127824 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.496244 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.542816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" event={"ID":"47e392d4-f48b-4079-afd3-a5d7fae209a8","Type":"ContainerStarted","Data":"e472827db9139a7f614c97e83facefb5442c631000200b49113eeb8a3e5f4be5"} Jan 21 14:45:53 crc kubenswrapper[4720]: I0121 14:45:53.598370 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:53 crc kubenswrapper[4720]: W0121 14:45:53.604627 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaa470a6_13e1_47a6_a036_d9a5bab976e6.slice/crio-66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d WatchSource:0}: Error finding container 66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d: Status 404 returned error can't find the container with id 66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d Jan 21 14:45:54 crc kubenswrapper[4720]: I0121 14:45:54.552273 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" event={"ID":"baa470a6-13e1-47a6-a036-d9a5bab976e6","Type":"ContainerStarted","Data":"66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d"} Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.635556 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.674029 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.684163 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.696688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.752899 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.753270 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.753299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854508 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854552 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.854643 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.855586 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.855929 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.888746 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfjd2\" (UniqueName: 
\"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"dnsmasq-dns-666b6646f7-l76kh\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:55 crc kubenswrapper[4720]: I0121 14:45:55.991562 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.012341 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.021569 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.026034 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.042899 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165258 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165412 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.165937 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267121 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.267594 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.268560 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.268851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.300438 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"dnsmasq-dns-57d769cc4f-nbfkp\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.370777 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.589250 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:45:56 crc kubenswrapper[4720]: W0121 14:45:56.602764 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b076fbb_9c67_4e19_a9e6_1acb75a52cb8.slice/crio-595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871 WatchSource:0}: Error finding container 595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871: Status 404 returned error can't find the container with id 595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871 Jan 21 14:45:56 crc kubenswrapper[4720]: I0121 14:45:56.875006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.087555 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.088676 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095323 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095483 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095622 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095847 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.095917 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qrxkj" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.105461 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.120888 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.235172 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.236337 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.248581 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.248949 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d7vj6" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249130 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249295 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.249452 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.250345 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.257291 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.270317 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283375 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283433 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283507 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283545 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283594 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283684 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283769 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283848 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.283883 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386505 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386599 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386724 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386788 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f5db\" (UniqueName: 
\"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386852 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.386977 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387247 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387297 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387314 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387381 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387401 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387426 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387472 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387876 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.387968 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.389321 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.389606 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.391024 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408145 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408356 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.408490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.414589 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.425443 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.429957 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.443138 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489594 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489679 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489790 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489843 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489872 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489926 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.489948 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491047 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491572 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.491672 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.492366 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.493427 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.493802 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.494544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.495088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.500896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.505384 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.523530 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.570043 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.604847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerStarted","Data":"595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871"} Jan 21 14:45:57 crc kubenswrapper[4720]: I0121 14:45:57.868160 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.373136 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.383313 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.383451 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.392989 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bn29f" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393264 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393454 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.393858 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.396885 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507930 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507950 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507976 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.507995 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.508284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.508407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609622 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609781 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609796 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.609878 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.610224 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.610650 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.611088 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-config-data-default\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.612437 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.613013 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab11441b-6bc4-4883-8a1e-866b31b425e9-kolla-config\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.618936 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.619995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab11441b-6bc4-4883-8a1e-866b31b425e9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.634995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.638936 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtf9\" (UniqueName: \"kubernetes.io/projected/ab11441b-6bc4-4883-8a1e-866b31b425e9-kube-api-access-shtf9\") pod \"openstack-galera-0\" (UID: \"ab11441b-6bc4-4883-8a1e-866b31b425e9\") " pod="openstack/openstack-galera-0" Jan 21 14:45:58 crc kubenswrapper[4720]: I0121 14:45:58.710938 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.555650 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.556993 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.562971 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.565266 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.566393 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hb7pt" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.563218 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.573139 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629110 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629174 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629518 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod 
\"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629557 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.629571 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731237 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731293 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731345 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731370 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731452 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731494 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731527 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.731573 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.732172 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.732317 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.735959 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.736385 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.736681 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.744125 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.747368 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.762800 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gljtn\" (UniqueName: \"kubernetes.io/projected/8a6a2220-24c4-4a0b-b72e-848dbac6a14b-kube-api-access-gljtn\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc 
kubenswrapper[4720]: I0121 14:45:59.763256 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"8a6a2220-24c4-4a0b-b72e-848dbac6a14b\") " pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.895646 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.936324 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.937628 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.940804 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-pf64s" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.941401 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.942881 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 21 14:45:59 crc kubenswrapper[4720]: I0121 14:45:59.951324 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036137 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8gwp\" (UniqueName: \"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036177 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036210 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036229 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.036286 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137678 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8gwp\" (UniqueName: 
\"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137717 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137750 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137767 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.137831 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.138787 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-config-data\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.138856 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kolla-config\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.145143 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.149111 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c29d26-d7a2-40b5-81b8-ffda85c198d3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.162279 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8gwp\" (UniqueName: \"kubernetes.io/projected/73c29d26-d7a2-40b5-81b8-ffda85c198d3-kube-api-access-l8gwp\") pod \"memcached-0\" (UID: \"73c29d26-d7a2-40b5-81b8-ffda85c198d3\") " pod="openstack/memcached-0" Jan 21 14:46:00 crc kubenswrapper[4720]: I0121 14:46:00.256939 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 21 14:46:01 crc kubenswrapper[4720]: W0121 14:46:01.272574 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf7a9dc_02fc_4976_afd7_2e172728b008.slice/crio-6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d WatchSource:0}: Error finding container 6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d: Status 404 returned error can't find the container with id 6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.659615 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerStarted","Data":"6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d"} Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.733060 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.742044 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.745471 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.747714 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-tqvx9" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.870888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.973367 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:01 crc kubenswrapper[4720]: I0121 14:46:01.999215 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"kube-state-metrics-0\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " pod="openstack/kube-state-metrics-0" Jan 21 14:46:02 crc kubenswrapper[4720]: I0121 14:46:02.060368 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.953550 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.957982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.966899 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.967061 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.967084 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9sf8m" Jan 21 14:46:04 crc kubenswrapper[4720]: I0121 14:46:04.969476 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.003794 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.005303 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.032361 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034465 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034506 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034574 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034626 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: 
\"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.034652 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136526 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136597 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136629 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136682 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136748 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136787 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 
14:46:05.136836 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136870 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.136902 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137006 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137035 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137317 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137340 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-run\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.137956 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/95379233-3cd8-4dd3-bf0f-b8198f2258e1-var-log-ovn\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.140899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95379233-3cd8-4dd3-bf0f-b8198f2258e1-scripts\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.155286 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-combined-ca-bundle\") pod 
\"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.158046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/95379233-3cd8-4dd3-bf0f-b8198f2258e1-ovn-controller-tls-certs\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.158216 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nspjf\" (UniqueName: \"kubernetes.io/projected/95379233-3cd8-4dd3-bf0f-b8198f2258e1-kube-api-access-nspjf\") pod \"ovn-controller-wpvzs\" (UID: \"95379233-3cd8-4dd3-bf0f-b8198f2258e1\") " pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240516 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240586 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240617 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240678 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240727 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.240782 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243274 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-log\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243350 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-etc-ovs\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243378 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-run\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.243708 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/04da7387-73aa-43e0-b547-7ce56e71d865-var-lib\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.245751 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04da7387-73aa-43e0-b547-7ce56e71d865-scripts\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.263395 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzng\" (UniqueName: \"kubernetes.io/projected/04da7387-73aa-43e0-b547-7ce56e71d865-kube-api-access-pbzng\") pod \"ovn-controller-ovs-2v7f2\" (UID: \"04da7387-73aa-43e0-b547-7ce56e71d865\") " pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.305797 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:05 crc kubenswrapper[4720]: I0121 14:46:05.328669 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.450250 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.451980 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.451980 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458004 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458177 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-x8xgt"
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458357 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458530 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Jan 21 14:46:06 crc kubenswrapper[4720]: I0121 14:46:06.458764 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.074800 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.094882 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.094945 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095021 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095106 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095134 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095211 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.095287 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196468 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196766 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.196937 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197146 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197279 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197316 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197356 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197391 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.197862 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.198198 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-config\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.198970 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8cf4740-b779-4759-92d1-22ce3e5f1369-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.202993 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.203807 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.204839 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e8cf4740-b779-4759-92d1-22ce3e5f1369-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.219613 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.220425 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvqs\" (UniqueName: \"kubernetes.io/projected/e8cf4740-b779-4759-92d1-22ce3e5f1369-kube-api-access-lpvqs\") pod \"ovsdbserver-nb-0\" (UID: \"e8cf4740-b779-4759-92d1-22ce3e5f1369\") " pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:07 crc kubenswrapper[4720]: I0121 14:46:07.366872 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Jan 21 14:46:09 crc kubenswrapper[4720]: I0121 14:46:09.195708 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.123453 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.125228 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.129313 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8q9gq"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.130623 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.130930 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.132068 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.138699 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267360 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267446 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267493 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267556 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267592 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267673 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.267725 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368694 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368755 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368805 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368833 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368894 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368923 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.368955 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.369298 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370402 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.370552 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.375075 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.375280 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.377000 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.390925 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.392006 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnr5\" (UniqueName: \"kubernetes.io/projected/4b833ac6-f279-4dfb-84fb-22b531e6b7ef-kube-api-access-plnr5\") pod \"ovsdbserver-sb-0\" (UID: \"4b833ac6-f279-4dfb-84fb-22b531e6b7ef\") " pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:10 crc kubenswrapper[4720]: I0121 14:46:10.460227 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Jan 21 14:46:22 crc kubenswrapper[4720]: I0121 14:46:22.880342 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:46:22 crc kubenswrapper[4720]: I0121 14:46:22.881520 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.176352 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.177031 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gfjd2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-l76kh_openstack(7b076fbb-9c67-4e19-a9e6-1acb75a52cb8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.178372 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"
Jan 21 14:46:29 crc kubenswrapper[4720]: I0121 14:46:29.209736 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"348934cdbf75477f1ab960f3f1053dff6dbf9d2daa8c4387234ea6851e521a6d"}
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.212806 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.235180 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.235543 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kg5kp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-56stq_openstack(baa470a6-13e1-47a6-a036-d9a5bab976e6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.237275 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" podUID="baa470a6-13e1-47a6-a036-d9a5bab976e6" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.292726 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.292869 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t64nn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-nbfkp_openstack(7bf7a9dc-02fc-4976-afd7-2e172728b008): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.294460 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.340851 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.341019 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfgg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-lvnwp_openstack(47e392d4-f48b-4079-afd3-a5d7fae209a8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:46:29 crc kubenswrapper[4720]: E0121 14:46:29.342318 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" podUID="47e392d4-f48b-4079-afd3-a5d7fae209a8" Jan 21 14:46:29 crc kubenswrapper[4720]: I0121 14:46:29.993216 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.018574 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.030202 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: W0121 14:46:30.042211 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a6a2220_24c4_4a0b_b72e_848dbac6a14b.slice/crio-2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e WatchSource:0}: Error finding container 2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e: Status 404 returned error can't find the container with id 2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.216264 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs" event={"ID":"95379233-3cd8-4dd3-bf0f-b8198f2258e1","Type":"ContainerStarted","Data":"14901142bbe0cf9c3f5f739a21ffe8acf3688a095725823833475bf0b7a521cc"} Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.217646 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"2abcef7bbdeb906aad91b06678021ef36456c0d3c8dd5748e9e075ac93d7913e"} Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.218822 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"73c29d26-d7a2-40b5-81b8-ffda85c198d3","Type":"ContainerStarted","Data":"757461f9b0b16f1b1d3b7636456ed819d453bc36911002c98bf376775e84071e"} Jan 21 14:46:30 crc kubenswrapper[4720]: E0121 14:46:30.221191 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.325948 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.333951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.341207 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:46:30 crc kubenswrapper[4720]: I0121 14:46:30.596750 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.113588 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-2v7f2"] Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.233817 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"6a3244b72bc0d2e8692db27fe44256e6ab79a8541de8947798ad30865a4efc75"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.235799 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerStarted","Data":"b5de03c99a86e921243af3619119b73c952c5f3ccc688bb6fd4a69b6fda32dd9"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.236956 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"da6b6b430f12d2b56cf212530b8e484bf3b8d0da1c76e1f2c9cac8d57f6efdf2"} Jan 21 14:46:31 crc kubenswrapper[4720]: I0121 14:46:31.323927 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 21 14:46:32 crc kubenswrapper[4720]: W0121 14:46:32.352864 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8cf4740_b779_4759_92d1_22ce3e5f1369.slice/crio-feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9 WatchSource:0}: Error finding container feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9: Status 404 returned error can't find the container with id feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9 Jan 21 14:46:32 crc kubenswrapper[4720]: W0121 14:46:32.357019 4720 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b833ac6_f279_4dfb_84fb_22b531e6b7ef.slice/crio-059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc WatchSource:0}: Error finding container 059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc: Status 404 returned error can't find the container with id 059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.426437 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.438976 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540099 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540181 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.540251 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") pod \"baa470a6-13e1-47a6-a036-d9a5bab976e6\" (UID: \"baa470a6-13e1-47a6-a036-d9a5bab976e6\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.541115 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.541566 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config" (OuterVolumeSpecName: "config") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.546823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp" (OuterVolumeSpecName: "kube-api-access-kg5kp") pod "baa470a6-13e1-47a6-a036-d9a5bab976e6" (UID: "baa470a6-13e1-47a6-a036-d9a5bab976e6"). InnerVolumeSpecName "kube-api-access-kg5kp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642063 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") pod \"47e392d4-f48b-4079-afd3-a5d7fae209a8\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642249 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") pod \"47e392d4-f48b-4079-afd3-a5d7fae209a8\" (UID: \"47e392d4-f48b-4079-afd3-a5d7fae209a8\") " Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642532 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642547 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg5kp\" (UniqueName: \"kubernetes.io/projected/baa470a6-13e1-47a6-a036-d9a5bab976e6-kube-api-access-kg5kp\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642558 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/baa470a6-13e1-47a6-a036-d9a5bab976e6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.642960 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config" (OuterVolumeSpecName: "config") pod "47e392d4-f48b-4079-afd3-a5d7fae209a8" (UID: "47e392d4-f48b-4079-afd3-a5d7fae209a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.658618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5" (OuterVolumeSpecName: "kube-api-access-pfgg5") pod "47e392d4-f48b-4079-afd3-a5d7fae209a8" (UID: "47e392d4-f48b-4079-afd3-a5d7fae209a8"). InnerVolumeSpecName "kube-api-access-pfgg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.744477 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfgg5\" (UniqueName: \"kubernetes.io/projected/47e392d4-f48b-4079-afd3-a5d7fae209a8-kube-api-access-pfgg5\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:32 crc kubenswrapper[4720]: I0121 14:46:32.744510 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47e392d4-f48b-4079-afd3-a5d7fae209a8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.280305 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"354eb4da79f832f710738ebccd702c93cde8a2bb019f171edf279baa7729d9ce"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.284314 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"059be9a8e3e4e5cf81487a724d9df0847e2d6ddc23e8f512ca0fe2e54a3eeecc"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.287050 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"feffe54f31a83065155ac0e2ce33fa53a57290884e776615eb2cca82e911bee9"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.288234 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" event={"ID":"47e392d4-f48b-4079-afd3-a5d7fae209a8","Type":"ContainerDied","Data":"e472827db9139a7f614c97e83facefb5442c631000200b49113eeb8a3e5f4be5"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.288291 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lvnwp" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.290147 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" event={"ID":"baa470a6-13e1-47a6-a036-d9a5bab976e6","Type":"ContainerDied","Data":"66ff83cf99122294432f39dacae5d6e428cce0897ec84f2469a57dde751a1a7d"} Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.290225 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-56stq" Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.340066 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.344449 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lvnwp"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.371791 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:46:33 crc kubenswrapper[4720]: I0121 14:46:33.378092 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-56stq"] Jan 21 14:46:34 crc kubenswrapper[4720]: I0121 14:46:34.688478 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e392d4-f48b-4079-afd3-a5d7fae209a8" path="/var/lib/kubelet/pods/47e392d4-f48b-4079-afd3-a5d7fae209a8/volumes" Jan 21 14:46:34 crc kubenswrapper[4720]: I0121 14:46:34.689303 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa470a6-13e1-47a6-a036-d9a5bab976e6" path="/var/lib/kubelet/pods/baa470a6-13e1-47a6-a036-d9a5bab976e6/volumes" Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.330274 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"73c29d26-d7a2-40b5-81b8-ffda85c198d3","Type":"ContainerStarted","Data":"9d9afaeb9b65beb101a57bd12820454808396b6e2e005a99a7bcaadb7bb3e1ee"} Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.330830 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 21 14:46:38 crc kubenswrapper[4720]: I0121 14:46:38.358042 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=31.466588956 podStartE2EDuration="39.358021989s" podCreationTimestamp="2026-01-21 14:45:59 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.035228824 +0000 UTC m=+1027.943968756" lastFinishedPulling="2026-01-21 14:46:37.926661857 +0000 UTC m=+1035.835401789" observedRunningTime="2026-01-21 14:46:38.353461605 +0000 UTC m=+1036.262201527" watchObservedRunningTime="2026-01-21 14:46:38.358021989 +0000 UTC m=+1036.266761931" Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.341877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.345540 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.347643 4720 generic.go:334] "Generic (PLEG): container finished" podID="04da7387-73aa-43e0-b547-7ce56e71d865" containerID="a5022c1f1b525d33e238d4482c9892406af087321aad8bd0ab2a7ea73cd4e288" exitCode=0 Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.347713 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerDied","Data":"a5022c1f1b525d33e238d4482c9892406af087321aad8bd0ab2a7ea73cd4e288"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.350545 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"0b1e13a703fb2f04a9edb1e5f91e4500fd7bd28dfa40a775132461a8b5680c63"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.352825 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.355067 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.357216 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs" event={"ID":"95379233-3cd8-4dd3-bf0f-b8198f2258e1","Type":"ContainerStarted","Data":"4fbf8c1ee36ba5d1aef5429cac91c5f160551d648190b4fdd659ba7ceb48ae56"} Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.357244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wpvzs" Jan 21 14:46:39 crc kubenswrapper[4720]: I0121 14:46:39.391146 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpvzs" podStartSLOduration=27.41627989 podStartE2EDuration="35.391125688s" podCreationTimestamp="2026-01-21 14:46:04 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.041396132 +0000 UTC m=+1027.950136064" lastFinishedPulling="2026-01-21 14:46:38.01624193 +0000 UTC m=+1035.924981862" observedRunningTime="2026-01-21 14:46:39.383287745 +0000 UTC m=+1037.292027697" watchObservedRunningTime="2026-01-21 14:46:39.391125688 +0000 UTC m=+1037.299865620" Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.365612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"128dc883c1cb54d300554c0a1f9c402d25a7b42a58cf0cd66f72035cd7595489"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.368607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"c83330103379edd07e7e941627350e72fc565a47bf701190f1364a9ace4bad2d"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.370926 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerStarted","Data":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} Jan 21 14:46:40 crc kubenswrapper[4720]: I0121 14:46:40.397337 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=29.703317961 podStartE2EDuration="39.397316024s" podCreationTimestamp="2026-01-21 14:46:01 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.322866577 +0000 UTC m=+1028.231606509" lastFinishedPulling="2026-01-21 14:46:40.01686464 +0000 UTC m=+1037.925604572" observedRunningTime="2026-01-21 14:46:40.386954242 +0000 UTC m=+1038.295694204" watchObservedRunningTime="2026-01-21 14:46:40.397316024 +0000 UTC m=+1038.306055956" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.381818 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-2v7f2" event={"ID":"04da7387-73aa-43e0-b547-7ce56e71d865","Type":"ContainerStarted","Data":"39abff7ca05557b976eb4f12ae00164a0fe932683d17f8b0b89f7c43d6852f4d"} Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.381941 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.382250 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.382345 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-2v7f2" Jan 21 14:46:41 crc kubenswrapper[4720]: I0121 14:46:41.405085 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-2v7f2" podStartSLOduration=31.734498826 podStartE2EDuration="37.405069703s" podCreationTimestamp="2026-01-21 14:46:04 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.353453435 +0000 UTC m=+1030.262193367" lastFinishedPulling="2026-01-21 14:46:38.024024312 +0000 UTC m=+1035.932764244" observedRunningTime="2026-01-21 14:46:41.402828781 +0000 UTC m=+1039.311568733" watchObservedRunningTime="2026-01-21 14:46:41.405069703 +0000 UTC m=+1039.313809635" Jan 21 14:46:44 crc kubenswrapper[4720]: I0121 14:46:44.407283 4720 generic.go:334] "Generic (PLEG): container finished" podID="8a6a2220-24c4-4a0b-b72e-848dbac6a14b" containerID="6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b" exitCode=0 Jan 21 14:46:44 crc kubenswrapper[4720]: I0121 14:46:44.407402 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerDied","Data":"6993f2212b44bdd39bce66195a79454199b52176f7d2a859e4057fd31875db5b"} Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.260970 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.429511 4720 generic.go:334] "Generic (PLEG): container finished" podID="ab11441b-6bc4-4883-8a1e-866b31b425e9" containerID="35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1" exitCode=0 Jan 21 14:46:45 crc kubenswrapper[4720]: I0121 14:46:45.430184 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerDied","Data":"35d070f6a12774abaa5a565105029112dff39f2c0ed4e97e33f5220ea3b359c1"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.439021 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"e8cf4740-b779-4759-92d1-22ce3e5f1369","Type":"ContainerStarted","Data":"ef22f61d71a363f2789f831afd2fde0d4f527560248ef9949ba2ef8cbf2285f9"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.442114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ab11441b-6bc4-4883-8a1e-866b31b425e9","Type":"ContainerStarted","Data":"f6a741dcb96fee8bc1a243c1147923a6ae85456e0394f00facd1588d9dab71be"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.444502 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerID="0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a" exitCode=0 Jan 21 14:46:46 crc 
kubenswrapper[4720]: I0121 14:46:46.444536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.448520 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8a6a2220-24c4-4a0b-b72e-848dbac6a14b","Type":"ContainerStarted","Data":"72be484776fdc013ccba2e85aaac2161a9ffdf86ec968878165385a82bab219e"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.450617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b833ac6-f279-4dfb-84fb-22b531e6b7ef","Type":"ContainerStarted","Data":"dd2b0be046d44b565e237f62a26a48c9187b1871918bf42941523918804b0975"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.452566 4720 generic.go:334] "Generic (PLEG): container finished" podID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerID="483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f" exitCode=0 Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.452624 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f"} Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.461084 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.476533 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.446174559 podStartE2EDuration="41.476513535s" podCreationTimestamp="2026-01-21 14:46:05 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.361052771 +0000 UTC m=+1030.269792713" lastFinishedPulling="2026-01-21 14:46:45.391391757 +0000 UTC m=+1043.300131689" observedRunningTime="2026-01-21 14:46:46.472365472 +0000 UTC m=+1044.381105414" watchObservedRunningTime="2026-01-21 14:46:46.476513535 +0000 UTC m=+1044.385253477" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.518347 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.519808 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=41.820754639 podStartE2EDuration="49.519791585s" podCreationTimestamp="2026-01-21 14:45:57 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.325785827 +0000 UTC m=+1028.234525759" lastFinishedPulling="2026-01-21 14:46:38.024822753 +0000 UTC m=+1035.933562705" observedRunningTime="2026-01-21 14:46:46.518097168 +0000 UTC m=+1044.426837110" watchObservedRunningTime="2026-01-21 14:46:46.519791585 +0000 UTC m=+1044.428531517" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.579991 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=40.701506916 podStartE2EDuration="48.579968826s" podCreationTimestamp="2026-01-21 14:45:58 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.046186652 +0000 UTC m=+1027.954926584" lastFinishedPulling="2026-01-21 14:46:37.924648562 +0000 UTC m=+1035.833388494" observedRunningTime="2026-01-21 
14:46:46.578300499 +0000 UTC m=+1044.487040441" watchObservedRunningTime="2026-01-21 14:46:46.579968826 +0000 UTC m=+1044.488708778" Jan 21 14:46:46 crc kubenswrapper[4720]: I0121 14:46:46.600732 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.622010803 podStartE2EDuration="37.600713411s" podCreationTimestamp="2026-01-21 14:46:09 +0000 UTC" firstStartedPulling="2026-01-21 14:46:32.360975559 +0000 UTC m=+1030.269715491" lastFinishedPulling="2026-01-21 14:46:45.339678167 +0000 UTC m=+1043.248418099" observedRunningTime="2026-01-21 14:46:46.594751508 +0000 UTC m=+1044.503491450" watchObservedRunningTime="2026-01-21 14:46:46.600713411 +0000 UTC m=+1044.509453353" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.367481 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.462960 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerStarted","Data":"9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b"} Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.463171 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.465318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerStarted","Data":"a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285"} Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.465833 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.491605 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podStartSLOduration=3.692035401 podStartE2EDuration="52.491582903s" podCreationTimestamp="2026-01-21 14:45:55 +0000 UTC" firstStartedPulling="2026-01-21 14:45:56.606488604 +0000 UTC m=+994.515228536" lastFinishedPulling="2026-01-21 14:46:45.406036106 +0000 UTC m=+1043.314776038" observedRunningTime="2026-01-21 14:46:47.483939734 +0000 UTC m=+1045.392679686" watchObservedRunningTime="2026-01-21 14:46:47.491582903 +0000 UTC m=+1045.400322855" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.501081 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podStartSLOduration=7.37241547 podStartE2EDuration="51.501062731s" podCreationTimestamp="2026-01-21 14:45:56 +0000 UTC" firstStartedPulling="2026-01-21 14:46:01.277242991 +0000 UTC m=+999.185982923" lastFinishedPulling="2026-01-21 14:46:45.405890252 +0000 UTC m=+1043.314630184" observedRunningTime="2026-01-21 14:46:47.498673496 +0000 UTC m=+1045.407413428" watchObservedRunningTime="2026-01-21 14:46:47.501062731 +0000 UTC m=+1045.409802663" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.515858 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.797000 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.826274 4720 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.827702 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.830021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.840499 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.922557 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-h55pf"] Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.926122 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.931968 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.945540 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h55pf"] Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946885 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.946983 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:47 crc kubenswrapper[4720]: I0121 14:46:47.947097 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.048957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049167 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049288 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049325 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049365 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049390 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049420 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.049560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.050409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.050453 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.051039 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.068296 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"dnsmasq-dns-7f896c8c65-4xc4n\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") " pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.107175 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.137195 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.138335 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.142945 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.147219 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155544 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155816 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155834 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155895 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155917 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.155937 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovn-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156164 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-ovs-rundir\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.156640 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-config\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc 
kubenswrapper[4720]: I0121 14:46:48.162123 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-combined-ca-bundle\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.171068 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.204337 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.212605 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brszh\" (UniqueName: \"kubernetes.io/projected/4fc0e40b-c337-42d2-87a3-2eedfa2f1a65-kube-api-access-brszh\") pod \"ovn-controller-metrics-h55pf\" (UID: \"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65\") " pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.256040 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h55pf" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257340 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257479 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257580 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257681 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.257760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc 
kubenswrapper[4720]: I0121 14:46:48.361948 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.362160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.362244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.367835 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.365892 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.373999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.365109 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.374186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.375058 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.389018 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"dnsmasq-dns-86db49b7ff-r6bj4\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.475079 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" containerID="cri-o://a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" gracePeriod=10 Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.475923 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.586874 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.712540 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 21 14:46:48 crc kubenswrapper[4720]: I0121 14:46:48.712582 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.368136 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.412739 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.484469 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" containerID="cri-o://9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" gracePeriod=10 Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.526301 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.682287 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.684040 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.689811 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.690076 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.690203 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mzpmv" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.693010 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.701011 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.808729 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809172 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809276 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809618 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809668 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.809852 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: 
I0121 14:46:49.896478 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.896543 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911714 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911843 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911934 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.911966 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-config\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914424 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/262f8354-3f7b-483f-940d-8b0f394e344a-scripts\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.914478 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.928485 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.931469 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.933032 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/262f8354-3f7b-483f-940d-8b0f394e344a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:49 crc kubenswrapper[4720]: I0121 14:46:49.934138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8cp5\" (UniqueName: \"kubernetes.io/projected/262f8354-3f7b-483f-940d-8b0f394e344a-kube-api-access-q8cp5\") pod \"ovn-northd-0\" (UID: \"262f8354-3f7b-483f-940d-8b0f394e344a\") " pod="openstack/ovn-northd-0" Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.009820 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.155461 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.253502 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.261951 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h55pf"] Jan 21 14:46:50 crc kubenswrapper[4720]: W0121 14:46:50.269478 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fc0e40b_c337_42d2_87a3_2eedfa2f1a65.slice/crio-6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd WatchSource:0}: Error finding container 6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd: Status 404 returned error can't find the container with id 6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.494854 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.500239 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerStarted","Data":"796d2687528fd87d25dd4fc1a5f89808d76b284cc0b9360ef63068e7663548e8"} Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.501485 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h55pf" event={"ID":"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65","Type":"ContainerStarted","Data":"6ac968bf36e1eff3fb9de52c1805b1bde6084d708798b0864aa7f2930cfcc2dd"} Jan 21 14:46:50 crc kubenswrapper[4720]: I0121 14:46:50.502983 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerStarted","Data":"5c2c6c7764adf382494a9f5fec6ff894d74ac4a2178d6e2c114761cba32aec98"} Jan 21 14:46:50 crc kubenswrapper[4720]: W0121 14:46:50.504212 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod262f8354_3f7b_483f_940d_8b0f394e344a.slice/crio-a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745 WatchSource:0}: Error finding container a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745: Status 404 returned error can't find the container with id a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.015014 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.97:5353: connect: connection refused" Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.374739 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.580403 4720 generic.go:334] "Generic (PLEG): container finished" podID="7bf7a9dc-02fc-4976-afd7-2e172728b008" 
containerID="a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.580516 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.598628 4720 generic.go:334] "Generic (PLEG): container finished" podID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerID="ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.598720 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.633900 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h55pf" event={"ID":"4fc0e40b-c337-42d2-87a3-2eedfa2f1a65","Type":"ContainerStarted","Data":"fa9f95f15aae289dbfc14f2cb29da9f352c35627be5ad7715617cf9c24a323e5"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.649474 4720 generic.go:334] "Generic (PLEG): container finished" podID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerID="9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.649532 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.650799 4720 generic.go:334] "Generic (PLEG): container finished" podID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906" exitCode=0 Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.650831 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.674249 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"a411bc55d5a45113a5bfcdc112c0b9e66956bcec653b4665e16624b4de042745"} Jan 21 14:46:51 crc kubenswrapper[4720]: I0121 14:46:51.732982 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-h55pf" podStartSLOduration=4.732965662 podStartE2EDuration="4.732965662s" podCreationTimestamp="2026-01-21 14:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:51.690401701 +0000 UTC m=+1049.599141633" watchObservedRunningTime="2026-01-21 14:46:51.732965662 +0000 UTC m=+1049.641705594" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.074207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.284627 4720 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.307299 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382704 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382803 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382863 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.382883 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.383473 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") pod \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\" (UID: \"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.383576 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") pod \"7bf7a9dc-02fc-4976-afd7-2e172728b008\" (UID: \"7bf7a9dc-02fc-4976-afd7-2e172728b008\") " Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.397352 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2" (OuterVolumeSpecName: "kube-api-access-gfjd2") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "kube-api-access-gfjd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.397637 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn" (OuterVolumeSpecName: "kube-api-access-t64nn") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "kube-api-access-t64nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.438615 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.443510 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config" (OuterVolumeSpecName: "config") pod "7bf7a9dc-02fc-4976-afd7-2e172728b008" (UID: "7bf7a9dc-02fc-4976-afd7-2e172728b008"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.461313 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config" (OuterVolumeSpecName: "config") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.470204 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" (UID: "7b076fbb-9c67-4e19-a9e6-1acb75a52cb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486830 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t64nn\" (UniqueName: \"kubernetes.io/projected/7bf7a9dc-02fc-4976-afd7-2e172728b008-kube-api-access-t64nn\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486870 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486880 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486888 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486898 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfjd2\" (UniqueName: \"kubernetes.io/projected/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8-kube-api-access-gfjd2\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.486905 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bf7a9dc-02fc-4976-afd7-2e172728b008-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.685098 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.690480 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696686 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-nbfkp" event={"ID":"7bf7a9dc-02fc-4976-afd7-2e172728b008","Type":"ContainerDied","Data":"6650a88dae698abc02e358a7f68ed4173c431f0402a2eee9296a8d8cc7459c5d"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696722 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerStarted","Data":"2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.696771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-l76kh" event={"ID":"7b076fbb-9c67-4e19-a9e6-1acb75a52cb8","Type":"ContainerDied","Data":"595d585b971e43804af671995b49cfdd137a2f109ac81b7c395b8867814a6871"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697725 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697791 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697811 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerStarted","Data":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"} Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.697920 4720 scope.go:117] "RemoveContainer" containerID="a6a9c62b53ee9551c9149db01fa9be1bd5bffd59ecb9c1581fb71dba03034285" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.730397 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" podStartSLOduration=5.730380808 podStartE2EDuration="5.730380808s" podCreationTimestamp="2026-01-21 14:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:52.729451043 +0000 UTC m=+1050.638190995" watchObservedRunningTime="2026-01-21 14:46:52.730380808 +0000 UTC m=+1050.639120750" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.752400 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.754564 4720 scope.go:117] "RemoveContainer" containerID="0b786963c4e47855e06d100dbf56418f2e0998a3b22a8c8c54b9832d191b6f8a" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.778345 4720 scope.go:117] "RemoveContainer" containerID="9fc576eecbdb3fedfe8e2d5fa8d5c27836b60d3c3256b4bc86a99f02229d670b" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.802909 4720 scope.go:117] "RemoveContainer" containerID="483ca902ed4e5047641613f847b35ef340ea8ec0c446f87587d5090cf42d9a5f" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.804356 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-nbfkp"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.806106 4720 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" podStartSLOduration=4.806091402 podStartE2EDuration="4.806091402s" podCreationTimestamp="2026-01-21 14:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:52.765114485 +0000 UTC m=+1050.673854427" watchObservedRunningTime="2026-01-21 14:46:52.806091402 +0000 UTC m=+1050.714831324" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.818092 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.823863 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-l76kh"] Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881127 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881174 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881213 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881771 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:46:52 crc kubenswrapper[4720]: I0121 14:46:52.881821 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" gracePeriod=600 Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707384 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" exitCode=0 Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707445 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6"} Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707737 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} Jan 21 
14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.707757 4720 scope.go:117] "RemoveContainer" containerID="75aaa3118f909741ad221a6d4a71b9c6e4e33b0de93fc4cf721b556711ea2c47" Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714215 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"815adb0d9349b00edb50d5383a97b04d2ea58b47183d7b8c549b2f06e6736cf2"} Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714271 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"262f8354-3f7b-483f-940d-8b0f394e344a","Type":"ContainerStarted","Data":"825c39cc72d95021f48701e28220e0cc1b60e6d64d4e16d91eca4ddce4e14ddf"} Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.714297 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Jan 21 14:46:53 crc kubenswrapper[4720]: I0121 14:46:53.754395 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.505535751 podStartE2EDuration="4.75437442s" podCreationTimestamp="2026-01-21 14:46:49 +0000 UTC" firstStartedPulling="2026-01-21 14:46:50.506389007 +0000 UTC m=+1048.415128939" lastFinishedPulling="2026-01-21 14:46:52.755227676 +0000 UTC m=+1050.663967608" observedRunningTime="2026-01-21 14:46:53.746222317 +0000 UTC m=+1051.654962269" watchObservedRunningTime="2026-01-21 14:46:53.75437442 +0000 UTC m=+1051.663114362" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.533390 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.614403 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.696914 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" path="/var/lib/kubelet/pods/7b076fbb-9c67-4e19-a9e6-1acb75a52cb8/volumes" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.698179 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" path="/var/lib/kubelet/pods/7bf7a9dc-02fc-4976-afd7-2e172728b008/volumes" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.800586 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Jan 21 14:46:54 crc kubenswrapper[4720]: I0121 14:46:54.870522 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.532463 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533058 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533139 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533208 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="init" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 
14:46:55.533272 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="init" Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533337 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="init" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533387 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="init" Jan 21 14:46:55 crc kubenswrapper[4720]: E0121 14:46:55.533458 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533514 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533721 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf7a9dc-02fc-4976-afd7-2e172728b008" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.533797 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b076fbb-9c67-4e19-a9e6-1acb75a52cb8" containerName="dnsmasq-dns" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.534371 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.536715 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.541107 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.550818 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.569509 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.578401 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.637157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.637429 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739081 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739171 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739206 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.739282 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.740084 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.759301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j27h\" (UniqueName: 
\"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"glance-372a-account-create-update-w4xkf\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") " pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.841215 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.841446 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.843120 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.865451 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"glance-db-create-ckgkh\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") " pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.865717 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf" Jan 21 14:46:55 crc kubenswrapper[4720]: I0121 14:46:55.874591 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-ckgkh" Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.327395 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.469940 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:46:56 crc kubenswrapper[4720]: W0121 14:46:56.479948 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4bb55ed_9214_4f25_8740_ac50421baa4b.slice/crio-50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b WatchSource:0}: Error finding container 50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b: Status 404 returned error can't find the container with id 50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.743804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerStarted","Data":"7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.744053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerStarted","Data":"50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.745770 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerStarted","Data":"3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.745829 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerStarted","Data":"bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63"} Jan 21 14:46:56 crc kubenswrapper[4720]: I0121 14:46:56.774511 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-ckgkh" podStartSLOduration=1.774278123 podStartE2EDuration="1.774278123s" podCreationTimestamp="2026-01-21 14:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:46:56.768757173 +0000 UTC m=+1054.677497175" watchObservedRunningTime="2026-01-21 14:46:56.774278123 +0000 UTC m=+1054.683018095" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.268049 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.270185 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.273028 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.289392 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zp68q"] Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.364831 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.364908 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.466197 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.466595 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.467133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.484525 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"root-account-create-update-zp68q\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") " pod="openstack/root-account-create-update-zp68q" Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.586614 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zp68q"
Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.757249 4720 generic.go:334] "Generic (PLEG): container finished" podID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerID="3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0" exitCode=0
Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.757963 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerDied","Data":"3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0"}
Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.766435 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerID="7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044" exitCode=0
Jan 21 14:46:57 crc kubenswrapper[4720]: I0121 14:46:57.766485 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerDied","Data":"7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044"}
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.047868 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zp68q"]
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.149271 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.587880 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4"
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.696229 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"]
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778261 4720 generic.go:334] "Generic (PLEG): container finished" podID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerID="cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9" exitCode=0
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778732 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerDied","Data":"cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9"}
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778761 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerStarted","Data":"15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4"}
Jan 21 14:46:58 crc kubenswrapper[4720]: I0121 14:46:58.778959 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns" containerID="cri-o://776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" gracePeriod=10
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.230895 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.231411 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ckgkh"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.362447 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413093 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") pod \"0d0385ad-a123-4c46-a96f-652dee1f89cd\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") pod \"0d0385ad-a123-4c46-a96f-652dee1f89cd\" (UID: \"0d0385ad-a123-4c46-a96f-652dee1f89cd\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") pod \"b4bb55ed-9214-4f25-8740-ac50421baa4b\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.413416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") pod \"b4bb55ed-9214-4f25-8740-ac50421baa4b\" (UID: \"b4bb55ed-9214-4f25-8740-ac50421baa4b\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.414083 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b4bb55ed-9214-4f25-8740-ac50421baa4b" (UID: "b4bb55ed-9214-4f25-8740-ac50421baa4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.414426 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0d0385ad-a123-4c46-a96f-652dee1f89cd" (UID: "0d0385ad-a123-4c46-a96f-652dee1f89cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.429696 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268" (OuterVolumeSpecName: "kube-api-access-q8268") pod "0d0385ad-a123-4c46-a96f-652dee1f89cd" (UID: "0d0385ad-a123-4c46-a96f-652dee1f89cd"). InnerVolumeSpecName "kube-api-access-q8268". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.443985 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h" (OuterVolumeSpecName: "kube-api-access-2j27h") pod "b4bb55ed-9214-4f25-8740-ac50421baa4b" (UID: "b4bb55ed-9214-4f25-8740-ac50421baa4b"). InnerVolumeSpecName "kube-api-access-2j27h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514767 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514840 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514875 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.514921 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") pod \"92a2976e-a745-4fc4-ae87-355cf6defe5e\" (UID: \"92a2976e-a745-4fc4-ae87-355cf6defe5e\") "
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515422 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0d0385ad-a123-4c46-a96f-652dee1f89cd-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515437 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8268\" (UniqueName: \"kubernetes.io/projected/0d0385ad-a123-4c46-a96f-652dee1f89cd-kube-api-access-q8268\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515451 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j27h\" (UniqueName: \"kubernetes.io/projected/b4bb55ed-9214-4f25-8740-ac50421baa4b-kube-api-access-2j27h\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.515463 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b4bb55ed-9214-4f25-8740-ac50421baa4b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.527453 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg" (OuterVolumeSpecName: "kube-api-access-99xqg") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "kube-api-access-99xqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.553113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.557262 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.567582 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config" (OuterVolumeSpecName: "config") pod "92a2976e-a745-4fc4-ae87-355cf6defe5e" (UID: "92a2976e-a745-4fc4-ae87-355cf6defe5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617430 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617473 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617485 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99xqg\" (UniqueName: \"kubernetes.io/projected/92a2976e-a745-4fc4-ae87-355cf6defe5e-kube-api-access-99xqg\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.617497 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92a2976e-a745-4fc4-ae87-355cf6defe5e-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788596 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ckgkh"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788600 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ckgkh" event={"ID":"0d0385ad-a123-4c46-a96f-652dee1f89cd","Type":"ContainerDied","Data":"bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63"}
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.788672 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcd20e7f6ac82b5dcffae3b544849f7a8247b45941be3c501277dff7c8746f63"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-372a-account-create-update-w4xkf" event={"ID":"b4bb55ed-9214-4f25-8740-ac50421baa4b","Type":"ContainerDied","Data":"50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b"}
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790762 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f6fc698ccd23a99b1eca76dd8ba0bacc12fceffb9d64c77d0dc600a742ad9b"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.790808 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-372a-account-create-update-w4xkf"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793721 4720 generic.go:334] "Generic (PLEG): container finished" podID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196" exitCode=0
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793898 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.793903 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"}
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.794114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-4xc4n" event={"ID":"92a2976e-a745-4fc4-ae87-355cf6defe5e","Type":"ContainerDied","Data":"5c2c6c7764adf382494a9f5fec6ff894d74ac4a2178d6e2c114761cba32aec98"}
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.794137 4720 scope.go:117] "RemoveContainer" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.830724 4720 scope.go:117] "RemoveContainer" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.845500 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mcz8g"]
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.845941 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.845959 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update"
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.845998 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846006 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns"
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.846022 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="init"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846030 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="init"
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.846048 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846056 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846287 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" containerName="mariadb-database-create"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846299 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" containerName="mariadb-account-create-update"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846314 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" containerName="dnsmasq-dns"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.846840 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.852741 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"]
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.862137 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-4xc4n"]
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.862822 4720 scope.go:117] "RemoveContainer" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.865404 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": container with ID starting with 776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196 not found: ID does not exist" containerID="776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.865523 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196"} err="failed to get container status \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": rpc error: code = NotFound desc = could not find container \"776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196\": container with ID starting with 776dc9c8416f9801617b3484ca9e8cd1cc62e694f35849ce17876a8aa7e06196 not found: ID does not exist"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.865613 4720 scope.go:117] "RemoveContainer" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"
Jan 21 14:46:59 crc kubenswrapper[4720]: E0121 14:46:59.865995 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": container with ID starting with 6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906 not found: ID does not exist" containerID="6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.866017 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906"} err="failed to get container status \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": rpc error: code = NotFound desc = could not find container \"6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906\": container with ID starting with 6f7786d2264d71ca7c2f10284d3bfafeed9e1702b1b9a1588ef1bbc0db72c906 not found: ID does not exist"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.872822 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mcz8g"]
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.956647 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"]
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.957568 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.966106 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 21 14:46:59 crc kubenswrapper[4720]: I0121 14:46:59.968838 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.023919 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.024153 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.099389 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zp68q"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.125887 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126208 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126866 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.126929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.127179 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142026 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-55njq"]
Jan 21 14:47:00 crc kubenswrapper[4720]: E0121 14:47:00.142387 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142404 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.142784 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" containerName="mariadb-account-create-update"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.143284 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.149985 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-55njq"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.157992 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"keystone-db-create-mcz8g\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") " pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.173225 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.228912 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") pod \"90833a99-00de-45a6-a7c1-4357c6b5f36d\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") "
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229165 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") pod \"90833a99-00de-45a6-a7c1-4357c6b5f36d\" (UID: \"90833a99-00de-45a6-a7c1-4357c6b5f36d\") "
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229421 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229521 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229560 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.229602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.231456 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.232542 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90833a99-00de-45a6-a7c1-4357c6b5f36d" (UID: "90833a99-00de-45a6-a7c1-4357c6b5f36d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.239429 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5" (OuterVolumeSpecName: "kube-api-access-7t5s5") pod "90833a99-00de-45a6-a7c1-4357c6b5f36d" (UID: "90833a99-00de-45a6-a7c1-4357c6b5f36d"). InnerVolumeSpecName "kube-api-access-7t5s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.262826 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"keystone-06a3-account-create-update-dbk66\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") " pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.290928 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332474 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332541 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90833a99-00de-45a6-a7c1-4357c6b5f36d-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.332553 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t5s5\" (UniqueName: \"kubernetes.io/projected/90833a99-00de-45a6-a7c1-4357c6b5f36d-kube-api-access-7t5s5\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.334335 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.353710 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"placement-db-create-55njq\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") " pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.353759 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.354758 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.359312 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.366960 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.434456 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.434642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536013 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.536907 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.563151 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"placement-318a-account-create-update-lkf6p\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") " pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.611102 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-dtj5w"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.612167 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.616260 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dfmqw"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.618168 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.627935 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.646790 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dtj5w"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.682274 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.697232 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92a2976e-a745-4fc4-ae87-355cf6defe5e" path="/var/lib/kubelet/pods/92a2976e-a745-4fc4-ae87-355cf6defe5e/volumes"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.698297 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mcz8g"]
Jan 21 14:47:00 crc kubenswrapper[4720]: W0121 14:47:00.700594 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod290dffa3_ed33_4571_aeb1_092aae1d8105.slice/crio-1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3 WatchSource:0}: Error finding container 1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3: Status 404 returned error can't find the container with id 1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748290 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748735 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.748770 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.811350 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerStarted","Data":"1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3"}
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814102 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zp68q" event={"ID":"90833a99-00de-45a6-a7c1-4357c6b5f36d","Type":"ContainerDied","Data":"15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4"}
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814120 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f5f2ce53ca4b0e7dd67931d369536ff3991ac75a2ca12f6fce3d89de5a93f4"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.814184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zp68q"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851105 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.851322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.856634 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.858467 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.861920 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"]
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.859705 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.868923 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"glance-db-sync-dtj5w\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:00 crc kubenswrapper[4720]: I0121 14:47:00.936011 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dtj5w"
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.089995 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-55njq"]
Jan 21 14:47:01 crc kubenswrapper[4720]: W0121 14:47:01.090959 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49f0b95b_6621_43fe_93c2_d4e7704f1f61.slice/crio-3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06 WatchSource:0}: Error finding container 3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06: Status 404 returned error can't find the container with id 3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.190388 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"]
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.555813 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-dtj5w"]
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.824894 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerStarted","Data":"dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.824943 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerStarted","Data":"2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.826183 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerStarted","Data":"cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.827934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerStarted","Data":"41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.827957 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerStarted","Data":"49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.830044 4720 generic.go:334] "Generic (PLEG): container finished" podID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerID="93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7" exitCode=0
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.830086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerDied","Data":"93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.831133 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerStarted","Data":"3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06"}
Jan 21 14:47:01 crc kubenswrapper[4720]: I0121 14:47:01.847880 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-06a3-account-create-update-dbk66" podStartSLOduration=2.847851354 podStartE2EDuration="2.847851354s" podCreationTimestamp="2026-01-21 14:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:01.846674391 +0000 UTC m=+1059.755414333" watchObservedRunningTime="2026-01-21 14:47:01.847851354 +0000 UTC m=+1059.756591296"
Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.839378 4720 generic.go:334] "Generic (PLEG): container finished" podID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerID="abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947" exitCode=0
Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.839478 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerDied","Data":"abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947"}
Jan 21 14:47:02 crc kubenswrapper[4720]: I0121 14:47:02.895907 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-318a-account-create-update-lkf6p" podStartSLOduration=2.895888711 podStartE2EDuration="2.895888711s" podCreationTimestamp="2026-01-21 14:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:02.892450307 +0000 UTC m=+1060.801190239" watchObservedRunningTime="2026-01-21 14:47:02.895888711 +0000 UTC m=+1060.804628653"
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.152935 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.291039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") pod \"290dffa3-ed33-4571-aeb1-092aae1d8105\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") "
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.291115 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") pod \"290dffa3-ed33-4571-aeb1-092aae1d8105\" (UID: \"290dffa3-ed33-4571-aeb1-092aae1d8105\") "
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.292064 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "290dffa3-ed33-4571-aeb1-092aae1d8105" (UID: "290dffa3-ed33-4571-aeb1-092aae1d8105"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.297363 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh" (OuterVolumeSpecName: "kube-api-access-8bnwh") pod "290dffa3-ed33-4571-aeb1-092aae1d8105" (UID: "290dffa3-ed33-4571-aeb1-092aae1d8105"). InnerVolumeSpecName "kube-api-access-8bnwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.393317 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bnwh\" (UniqueName: \"kubernetes.io/projected/290dffa3-ed33-4571-aeb1-092aae1d8105-kube-api-access-8bnwh\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.393339 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290dffa3-ed33-4571-aeb1-092aae1d8105-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.556401 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zp68q"]
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.565528 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zp68q"]
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.856290 4720 generic.go:334] "Generic (PLEG): container finished" podID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerID="41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9" exitCode=0
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.856482 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerDied","Data":"41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9"}
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861213 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mcz8g" event={"ID":"290dffa3-ed33-4571-aeb1-092aae1d8105","Type":"ContainerDied","Data":"1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3"}
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861256 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0eaed9ebf8e37b7d7113aa66da4e4a8e47a0b85be9622d3fbe34dfdf3209e3"
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.861236 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mcz8g"
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.862672 4720 generic.go:334] "Generic (PLEG): container finished" podID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerID="dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181" exitCode=0
Jan 21 14:47:03 crc kubenswrapper[4720]: I0121 14:47:03.862807 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerDied","Data":"dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181"}
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.176396 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq"
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309019 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") pod \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") "
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309102 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") pod \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\" (UID: \"49f0b95b-6621-43fe-93c2-d4e7704f1f61\") "
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309584 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49f0b95b-6621-43fe-93c2-d4e7704f1f61" (UID: "49f0b95b-6621-43fe-93c2-d4e7704f1f61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.309693 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49f0b95b-6621-43fe-93c2-d4e7704f1f61-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.316209 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs" (OuterVolumeSpecName: "kube-api-access-x55hs") pod "49f0b95b-6621-43fe-93c2-d4e7704f1f61" (UID: "49f0b95b-6621-43fe-93c2-d4e7704f1f61"). InnerVolumeSpecName "kube-api-access-x55hs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.412020 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x55hs\" (UniqueName: \"kubernetes.io/projected/49f0b95b-6621-43fe-93c2-d4e7704f1f61-kube-api-access-x55hs\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.687499 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90833a99-00de-45a6-a7c1-4357c6b5f36d" path="/var/lib/kubelet/pods/90833a99-00de-45a6-a7c1-4357c6b5f36d/volumes"
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870369 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-55njq"
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-55njq" event={"ID":"49f0b95b-6621-43fe-93c2-d4e7704f1f61","Type":"ContainerDied","Data":"3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06"}
Jan 21 14:47:04 crc kubenswrapper[4720]: I0121 14:47:04.870995 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f52311ac4be9191b50d313b0610f5f223eaf86ba39fcbe30e3eaa05a9f82d06"
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.074323 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.276971 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.340227 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.430909 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") pod \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") "
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.430983 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") pod \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") "
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.431079 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") pod \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\" (UID: \"a4fbe0fa-0158-480f-9f6d-2d589da3b91e\") "
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.431096 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") pod \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\" (UID: \"8161ded5-d8ab-48b7-9c1a-16a7155641d1\") "
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.432113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4fbe0fa-0158-480f-9f6d-2d589da3b91e" (UID: "a4fbe0fa-0158-480f-9f6d-2d589da3b91e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.432174 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8161ded5-d8ab-48b7-9c1a-16a7155641d1" (UID: "8161ded5-d8ab-48b7-9c1a-16a7155641d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.436875 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d" (OuterVolumeSpecName: "kube-api-access-llz2d") pod "8161ded5-d8ab-48b7-9c1a-16a7155641d1" (UID: "8161ded5-d8ab-48b7-9c1a-16a7155641d1"). InnerVolumeSpecName "kube-api-access-llz2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.447128 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn" (OuterVolumeSpecName: "kube-api-access-fggzn") pod "a4fbe0fa-0158-480f-9f6d-2d589da3b91e" (UID: "a4fbe0fa-0158-480f-9f6d-2d589da3b91e"). InnerVolumeSpecName "kube-api-access-fggzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532913 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532951 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fggzn\" (UniqueName: \"kubernetes.io/projected/a4fbe0fa-0158-480f-9f6d-2d589da3b91e-kube-api-access-fggzn\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532966 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8161ded5-d8ab-48b7-9c1a-16a7155641d1-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:05 crc kubenswrapper[4720]: I0121 14:47:05.532997 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llz2d\" (UniqueName: \"kubernetes.io/projected/8161ded5-d8ab-48b7-9c1a-16a7155641d1-kube-api-access-llz2d\") on node \"crc\" DevicePath \"\""
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247639 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-06a3-account-create-update-dbk66" event={"ID":"8161ded5-d8ab-48b7-9c1a-16a7155641d1","Type":"ContainerDied","Data":"49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732"}
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247715 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49db172c874956edacb1f12fe4161e695fa0df2db97f11fd7c0e9120811d6732"
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.247782 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-06a3-account-create-update-dbk66"
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269266 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-318a-account-create-update-lkf6p" event={"ID":"a4fbe0fa-0158-480f-9f6d-2d589da3b91e","Type":"ContainerDied","Data":"2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90"}
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269311 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-318a-account-create-update-lkf6p"
Jan 21 14:47:06 crc kubenswrapper[4720]: I0121 14:47:06.269321 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2109b20b6f854145c58fbae0d14383fbd564d133a8afbb9af7549a06fd795e90"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.551574 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-89dxv"]
Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552272 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552285 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552296 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552303 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552330 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552338 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: E0121 14:47:08.552352 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552359 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552498 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552516 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552533 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" containerName="mariadb-database-create"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.552542 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" containerName="mariadb-account-create-update"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.553089 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.555706 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.568096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-89dxv"]
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.683179 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.683335 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.784273 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.784500 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.788003 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.835208 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"root-account-create-update-89dxv\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:08 crc kubenswrapper[4720]: I0121 14:47:08.870069 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-89dxv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.347328 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wpvzs" podUID="95379233-3cd8-4dd3-bf0f-b8198f2258e1" containerName="ovn-controller" probeResult="failure" output=<
Jan 21 14:47:10 crc kubenswrapper[4720]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Jan 21 14:47:10 crc kubenswrapper[4720]: >
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.396765 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2v7f2"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.402218 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-2v7f2"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.726306 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"]
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.727299 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.729784 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.750308 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"]
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814493 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814541 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv"
Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814625 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: 
\"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.814711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916503 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916592 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916627 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916697 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.916712 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917365 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917389 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod 
\"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917441 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.917454 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.918930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:10 crc kubenswrapper[4720]: I0121 14:47:10.942345 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"ovn-controller-wpvzs-config-5vhqv\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.055034 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.309804 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerID="c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea" exitCode=0 Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.309854 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea"} Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.312098 4720 generic.go:334] "Generic (PLEG): container finished" podID="3a2eafda-c352-4311-94d5-a1aec1422699" containerID="c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed" exitCode=0 Jan 21 14:47:11 crc kubenswrapper[4720]: I0121 14:47:11.312787 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed"} Jan 21 14:47:15 crc kubenswrapper[4720]: I0121 14:47:15.337636 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wpvzs" podUID="95379233-3cd8-4dd3-bf0f-b8198f2258e1" containerName="ovn-controller" probeResult="failure" output=< Jan 21 14:47:15 crc kubenswrapper[4720]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 21 14:47:15 crc kubenswrapper[4720]: > Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.375680 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.376429 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6m45w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-dtj5w_openstack(c40c650e-a05e-4cc0-88fa-d56eae92d29a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.378475 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-dtj5w" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" Jan 21 14:47:18 crc kubenswrapper[4720]: E0121 14:47:18.412782 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-dtj5w" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" Jan 21 14:47:18 crc kubenswrapper[4720]: I0121 14:47:18.733038 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-89dxv"] Jan 21 14:47:18 crc kubenswrapper[4720]: I0121 14:47:18.887716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:18 crc kubenswrapper[4720]: W0121 14:47:18.887814 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod049894b0_0575_4fc0_bbca_f75722e173af.slice/crio-388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c WatchSource:0}: Error finding container 388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c: Status 
404 returned error can't find the container with id 388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.411692 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerStarted","Data":"41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.412101 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.413088 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerStarted","Data":"7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.413114 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerStarted","Data":"388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.414602 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerStarted","Data":"bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.414634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerStarted","Data":"7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.417086 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerStarted","Data":"9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e"} Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.417499 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.462684 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=76.829157744 podStartE2EDuration="1m24.462669615s" podCreationTimestamp="2026-01-21 14:45:55 +0000 UTC" firstStartedPulling="2026-01-21 14:46:30.32224961 +0000 UTC m=+1028.230989542" lastFinishedPulling="2026-01-21 14:46:37.955761481 +0000 UTC m=+1035.864501413" observedRunningTime="2026-01-21 14:47:19.436041519 +0000 UTC m=+1077.344781461" watchObservedRunningTime="2026-01-21 14:47:19.462669615 +0000 UTC m=+1077.371409557" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.465543 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=74.599967347 podStartE2EDuration="1m23.465531223s" podCreationTimestamp="2026-01-21 14:45:56 +0000 UTC" firstStartedPulling="2026-01-21 14:46:28.577809154 +0000 UTC m=+1026.486549086" lastFinishedPulling="2026-01-21 14:46:37.44337303 +0000 UTC m=+1035.352112962" observedRunningTime="2026-01-21 14:47:19.457188516 +0000 UTC m=+1077.365928468" watchObservedRunningTime="2026-01-21 
14:47:19.465531223 +0000 UTC m=+1077.374271155" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.478549 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wpvzs-config-5vhqv" podStartSLOduration=9.478532977 podStartE2EDuration="9.478532977s" podCreationTimestamp="2026-01-21 14:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:19.47423851 +0000 UTC m=+1077.382978452" watchObservedRunningTime="2026-01-21 14:47:19.478532977 +0000 UTC m=+1077.387272909" Jan 21 14:47:19 crc kubenswrapper[4720]: I0121 14:47:19.489491 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-89dxv" podStartSLOduration=11.489476596 podStartE2EDuration="11.489476596s" podCreationTimestamp="2026-01-21 14:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:19.487883283 +0000 UTC m=+1077.396623235" watchObservedRunningTime="2026-01-21 14:47:19.489476596 +0000 UTC m=+1077.398216528" Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.370004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wpvzs" Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.439258 4720 generic.go:334] "Generic (PLEG): container finished" podID="049894b0-0575-4fc0-bbca-f75722e173af" containerID="7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687" exitCode=0 Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.439352 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerDied","Data":"7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687"} Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.447940 4720 generic.go:334] "Generic (PLEG): container finished" podID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerID="bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a" exitCode=0 Jan 21 14:47:20 crc kubenswrapper[4720]: I0121 14:47:20.449043 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerDied","Data":"bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a"} Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.817692 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.902963 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.938455 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") pod \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.938583 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") pod \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\" (UID: \"1fc2d647-37b6-4437-98fc-1d95af05cfe0\") " Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.939351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fc2d647-37b6-4437-98fc-1d95af05cfe0" (UID: "1fc2d647-37b6-4437-98fc-1d95af05cfe0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:21 crc kubenswrapper[4720]: I0121 14:47:21.953511 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5" (OuterVolumeSpecName: "kube-api-access-z5lc5") pod "1fc2d647-37b6-4437-98fc-1d95af05cfe0" (UID: "1fc2d647-37b6-4437-98fc-1d95af05cfe0"). InnerVolumeSpecName "kube-api-access-z5lc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040320 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040435 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040464 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040518 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run" (OuterVolumeSpecName: "var-run") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.040997 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041036 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") pod \"049894b0-0575-4fc0-bbca-f75722e173af\" (UID: \"049894b0-0575-4fc0-bbca-f75722e173af\") " Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041040 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041095 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041155 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041349 4720 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041364 4720 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041375 4720 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041388 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5lc5\" (UniqueName: \"kubernetes.io/projected/1fc2d647-37b6-4437-98fc-1d95af05cfe0-kube-api-access-z5lc5\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041399 4720 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/049894b0-0575-4fc0-bbca-f75722e173af-var-run\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041410 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc2d647-37b6-4437-98fc-1d95af05cfe0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.041456 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts" (OuterVolumeSpecName: "scripts") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.046230 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9" (OuterVolumeSpecName: "kube-api-access-gfzq9") pod "049894b0-0575-4fc0-bbca-f75722e173af" (UID: "049894b0-0575-4fc0-bbca-f75722e173af"). InnerVolumeSpecName "kube-api-access-gfzq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.142768 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/049894b0-0575-4fc0-bbca-f75722e173af-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.142798 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzq9\" (UniqueName: \"kubernetes.io/projected/049894b0-0575-4fc0-bbca-f75722e173af-kube-api-access-gfzq9\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462835 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wpvzs-config-5vhqv" event={"ID":"049894b0-0575-4fc0-bbca-f75722e173af","Type":"ContainerDied","Data":"388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c"} Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462881 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="388ec9bba1e28c15c50ec8b57590f559224ac51494f8e33c74a1a528e3d13c8c" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.462880 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wpvzs-config-5vhqv" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-89dxv" event={"ID":"1fc2d647-37b6-4437-98fc-1d95af05cfe0","Type":"ContainerDied","Data":"7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8"} Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464443 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4155d41737ffdf00447905247811a40e9ef38968723dae87057b3c2f1c49f8" Jan 21 14:47:22 crc kubenswrapper[4720]: I0121 14:47:22.464489 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-89dxv" Jan 21 14:47:23 crc kubenswrapper[4720]: I0121 14:47:23.020741 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:23 crc kubenswrapper[4720]: I0121 14:47:23.029624 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wpvzs-config-5vhqv"] Jan 21 14:47:24 crc kubenswrapper[4720]: I0121 14:47:24.694840 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049894b0-0575-4fc0-bbca-f75722e173af" path="/var/lib/kubelet/pods/049894b0-0575-4fc0-bbca-f75722e173af/volumes" Jan 21 14:47:31 crc kubenswrapper[4720]: I0121 14:47:31.529762 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerStarted","Data":"8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed"} Jan 21 14:47:31 crc kubenswrapper[4720]: I0121 14:47:31.550238 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-dtj5w" podStartSLOduration=2.920606295 podStartE2EDuration="31.550220566s" podCreationTimestamp="2026-01-21 14:47:00 +0000 UTC" firstStartedPulling="2026-01-21 14:47:01.57154684 +0000 UTC m=+1059.480286772" lastFinishedPulling="2026-01-21 14:47:30.201161091 +0000 UTC m=+1088.109901043" observedRunningTime="2026-01-21 14:47:31.545414185 +0000 UTC m=+1089.454154127" watchObservedRunningTime="2026-01-21 14:47:31.550220566 +0000 UTC m=+1089.458960498" Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.447874 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.592231 4720 generic.go:334] "Generic (PLEG): container finished" podID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerID="8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed" exitCode=0 Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.592283 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerDied","Data":"8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed"} Jan 21 14:47:37 crc kubenswrapper[4720]: I0121 14:47:37.871874 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.024714 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:38 crc kubenswrapper[4720]: E0121 14:47:38.025316 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025337 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: E0121 14:47:38.025356 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025362 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025538 4720 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" containerName="mariadb-account-create-update" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.025559 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="049894b0-0575-4fc0-bbca-f75722e173af" containerName="ovn-config" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.026146 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.040715 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.041809 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.051059 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.053778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.072704 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.098409 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.099350 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.115679 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153462 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153526 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153550 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153593 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc 
kubenswrapper[4720]: I0121 14:47:38.153621 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.153639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.181113 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.182101 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.190412 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.198181 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254747 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254765 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254795 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") 
" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254906 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.254954 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.255608 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.256139 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.256219 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.278733 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"barbican-db-create-pmrgf\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.283020 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"cinder-db-create-qjpx9\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.288397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"barbican-b1e4-account-create-update-qtmr9\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " 
pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.300299 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.301245 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304486 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304947 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.304997 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.314343 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355972 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.355999 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356031 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356061 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.356793 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: 
\"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.376183 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.377014 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"cinder-43f1-account-create-update-bsqmb\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.382951 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.392737 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.392835 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.393748 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.430125 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457118 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457142 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457173 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.457211 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc 
kubenswrapper[4720]: I0121 14:47:38.463616 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.464140 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.489896 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"keystone-db-sync-f47pm\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.502107 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.559500 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.559929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.560539 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.604120 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"neutron-db-create-md2wm\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.614593 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.615620 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.617933 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.638133 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.656299 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.661970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.662039 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.763088 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.763287 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.764019 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.783526 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"neutron-9105-account-create-update-h4nvp\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.783854 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:38 crc kubenswrapper[4720]: I0121 14:47:38.951174 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.055778 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-qjpx9"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.061834 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.230612 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281222 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281273 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281380 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.281422 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") pod \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\" (UID: \"c40c650e-a05e-4cc0-88fa-d56eae92d29a\") " Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.282271 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:47:39 crc kubenswrapper[4720]: W0121 14:47:39.309123 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6545ddce_5b65_4702_9dee_2f2d9644123e.slice/crio-528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc WatchSource:0}: Error finding container 528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc: Status 404 returned error can't find the container with id 528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.317367 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.317986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w" (OuterVolumeSpecName: "kube-api-access-6m45w") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "kube-api-access-6m45w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.320721 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.370802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384006 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384031 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m45w\" (UniqueName: \"kubernetes.io/projected/c40c650e-a05e-4cc0-88fa-d56eae92d29a-kube-api-access-6m45w\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.384042 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.488448 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data" (OuterVolumeSpecName: "config-data") pod "c40c650e-a05e-4cc0-88fa-d56eae92d29a" (UID: "c40c650e-a05e-4cc0-88fa-d56eae92d29a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.490708 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c40c650e-a05e-4cc0-88fa-d56eae92d29a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.511243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.517079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-f47pm"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.523736 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.619721 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerStarted","Data":"44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.627847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerStarted","Data":"a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647411 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-dtj5w" event={"ID":"c40c650e-a05e-4cc0-88fa-d56eae92d29a","Type":"ContainerDied","Data":"cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647452 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.647542 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-dtj5w" Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.664206 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerStarted","Data":"528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.672784 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerStarted","Data":"658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.679928 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerStarted","Data":"80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.682076 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerStarted","Data":"dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.683860 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerStarted","Data":"13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.683883 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerStarted","Data":"e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3"} Jan 21 14:47:39 crc kubenswrapper[4720]: I0121 14:47:39.699185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-qjpx9" podStartSLOduration=2.699166943 podStartE2EDuration="2.699166943s" podCreationTimestamp="2026-01-21 14:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:39.698100624 +0000 UTC m=+1097.606840556" watchObservedRunningTime="2026-01-21 14:47:39.699166943 +0000 UTC m=+1097.607906875" Jan 21 14:47:39 crc kubenswrapper[4720]: E0121 14:47:39.721941 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40c650e_a05e_4cc0_88fa_d56eae92d29a.slice/crio-cf896b0a6c15a078a1d5ea17910a17801f89b537d091ad548be61dbf899f72a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc40c650e_a05e_4cc0_88fa_d56eae92d29a.slice\": RecentStats: unable to find data in memory cache]" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.207761 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:40 crc kubenswrapper[4720]: E0121 14:47:40.208469 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.208533 4720 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.208765 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" containerName="glance-db-sync" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.216978 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.244312 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311520 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311627 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311680 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311731 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.311746 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413151 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413489 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413547 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.413630 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.414582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.415639 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.418231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.419280 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.420354 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.470056 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"dnsmasq-dns-54f9b7b8d9-7dx9d\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.543971 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.695195 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerStarted","Data":"0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.697337 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerStarted","Data":"307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.708261 4720 generic.go:334] "Generic (PLEG): container finished" podID="077e6634-d42f-4765-ab65-9e24cf21a047" containerID="13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f" exitCode=0 Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.708329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerDied","Data":"13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.710745 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerStarted","Data":"5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.714171 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerStarted","Data":"b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.732252 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerStarted","Data":"f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d"} Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.763823 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b1e4-account-create-update-qtmr9" podStartSLOduration=3.763805322 podStartE2EDuration="3.763805322s" podCreationTimestamp="2026-01-21 14:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.729447896 +0000 UTC m=+1098.638187828" watchObservedRunningTime="2026-01-21 14:47:40.763805322 +0000 UTC m=+1098.672545254" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.795716 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-pmrgf" podStartSLOduration=2.795700792 podStartE2EDuration="2.795700792s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.788454084 +0000 UTC m=+1098.697194016" watchObservedRunningTime="2026-01-21 14:47:40.795700792 +0000 UTC m=+1098.704440724" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.812143 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-9105-account-create-update-h4nvp" podStartSLOduration=2.8121297800000002 podStartE2EDuration="2.81212978s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.809050496 +0000 UTC m=+1098.717790428" watchObservedRunningTime="2026-01-21 14:47:40.81212978 +0000 UTC m=+1098.720869712" Jan 21 14:47:40 crc kubenswrapper[4720]: I0121 14:47:40.839626 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-md2wm" podStartSLOduration=2.8396042 podStartE2EDuration="2.8396042s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.82936447 +0000 UTC m=+1098.738104402" watchObservedRunningTime="2026-01-21 14:47:40.8396042 +0000 UTC m=+1098.748344122" Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.182691 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-43f1-account-create-update-bsqmb" podStartSLOduration=3.182641113 podStartE2EDuration="3.182641113s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:40.851881624 +0000 UTC m=+1098.760621556" watchObservedRunningTime="2026-01-21 14:47:41.182641113 +0000 UTC m=+1099.091381045" Jan 21 14:47:41 crc kubenswrapper[4720]: W0121 14:47:41.194844 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7c8b46b_d758_4538_a345_21ccc71aabe4.slice/crio-e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586 WatchSource:0}: Error finding container e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586: Status 404 returned error can't find the container with id e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.211829 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.745353 4720 generic.go:334] "Generic (PLEG): container finished" podID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerID="b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.745434 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerDied","Data":"b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.747127 4720 generic.go:334] "Generic (PLEG): container finished" podID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerID="f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.747170 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerDied","Data":"f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.749384 4720 generic.go:334] "Generic (PLEG): container 
finished" podID="8da5c3a6-e588-412a-b884-7875fe439e61" containerID="0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.749452 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerDied","Data":"0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.751007 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerID="307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.751062 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerDied","Data":"307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.752384 4720 generic.go:334] "Generic (PLEG): container finished" podID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerID="5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.752431 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerDied","Data":"5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753849 4720 generic.go:334] "Generic (PLEG): container finished" podID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerID="856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87" exitCode=0 Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87"} Jan 21 14:47:41 crc kubenswrapper[4720]: I0121 14:47:41.753905 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerStarted","Data":"e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.160492 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.266752 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") pod \"077e6634-d42f-4765-ab65-9e24cf21a047\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.266873 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") pod \"077e6634-d42f-4765-ab65-9e24cf21a047\" (UID: \"077e6634-d42f-4765-ab65-9e24cf21a047\") " Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.268031 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "077e6634-d42f-4765-ab65-9e24cf21a047" (UID: "077e6634-d42f-4765-ab65-9e24cf21a047"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.273199 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq" (OuterVolumeSpecName: "kube-api-access-tvmlq") pod "077e6634-d42f-4765-ab65-9e24cf21a047" (UID: "077e6634-d42f-4765-ab65-9e24cf21a047"). InnerVolumeSpecName "kube-api-access-tvmlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.368363 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmlq\" (UniqueName: \"kubernetes.io/projected/077e6634-d42f-4765-ab65-9e24cf21a047-kube-api-access-tvmlq\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.368384 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/077e6634-d42f-4765-ab65-9e24cf21a047-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.766019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerStarted","Data":"e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.766149 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.771704 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-qjpx9" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.772405 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-qjpx9" event={"ID":"077e6634-d42f-4765-ab65-9e24cf21a047","Type":"ContainerDied","Data":"e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3"} Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.772433 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1397cb47568b144f9e51ff27d3be72176365c20c47cfef59e309117c01af8c3" Jan 21 14:47:42 crc kubenswrapper[4720]: I0121 14:47:42.801823 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podStartSLOduration=2.801805472 podStartE2EDuration="2.801805472s" podCreationTimestamp="2026-01-21 14:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:42.794631657 +0000 UTC m=+1100.703371589" watchObservedRunningTime="2026-01-21 14:47:42.801805472 +0000 UTC m=+1100.710545404" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.810804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e4-account-create-update-qtmr9" event={"ID":"8da5c3a6-e588-412a-b884-7875fe439e61","Type":"ContainerDied","Data":"658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.811439 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658d7be68d105dc01f8fc7dea1ad05505da4ae53e78a60889ef3189b0c240565" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.812554 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-md2wm" event={"ID":"d3a4204b-d91a-4d30-bea2-c327b452b61a","Type":"ContainerDied","Data":"80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.812576 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e7a592e6e7e74a3b4cfb70f06571bb7d095c2e27cd4d4950ffd10460938212" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.814948 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9105-account-create-update-h4nvp" event={"ID":"82f9f1ca-7fe3-4e17-8393-20364149010d","Type":"ContainerDied","Data":"44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.814975 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44b9f91c267339bfc8b0884fc8c5c2ef790947f87b24723a7a09d88ebc795b96" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.816999 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pmrgf" event={"ID":"5ffa29ff-07bd-40cc-9853-a484f79b382f","Type":"ContainerDied","Data":"a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.817037 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8464f3faefc31f591d60cd9bb491ee725d1832097b4b460ab3f30832c1f9786" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.827894 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-43f1-account-create-update-bsqmb" 
event={"ID":"6545ddce-5b65-4702-9dee-2f2d9644123e","Type":"ContainerDied","Data":"528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc"} Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.827955 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="528bbf4e75418d1c76364b9b2f67132ee68b2d967c83d53f51bfc164400ee8dc" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.922089 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.953643 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.980497 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:46 crc kubenswrapper[4720]: I0121 14:47:46.989184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.010673 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043154 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") pod \"82f9f1ca-7fe3-4e17-8393-20364149010d\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043196 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") pod \"8da5c3a6-e588-412a-b884-7875fe439e61\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043247 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") pod \"82f9f1ca-7fe3-4e17-8393-20364149010d\" (UID: \"82f9f1ca-7fe3-4e17-8393-20364149010d\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043271 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") pod \"d3a4204b-d91a-4d30-bea2-c327b452b61a\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") pod \"d3a4204b-d91a-4d30-bea2-c327b452b61a\" (UID: \"d3a4204b-d91a-4d30-bea2-c327b452b61a\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043349 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") pod \"8da5c3a6-e588-412a-b884-7875fe439e61\" (UID: \"8da5c3a6-e588-412a-b884-7875fe439e61\") " Jan 21 14:47:47 crc 
kubenswrapper[4720]: I0121 14:47:47.043387 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") pod \"5ffa29ff-07bd-40cc-9853-a484f79b382f\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043461 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") pod \"5ffa29ff-07bd-40cc-9853-a484f79b382f\" (UID: \"5ffa29ff-07bd-40cc-9853-a484f79b382f\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043492 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") pod \"6545ddce-5b65-4702-9dee-2f2d9644123e\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.043532 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") pod \"6545ddce-5b65-4702-9dee-2f2d9644123e\" (UID: \"6545ddce-5b65-4702-9dee-2f2d9644123e\") " Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.044992 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3a4204b-d91a-4d30-bea2-c327b452b61a" (UID: "d3a4204b-d91a-4d30-bea2-c327b452b61a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.048437 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8da5c3a6-e588-412a-b884-7875fe439e61" (UID: "8da5c3a6-e588-412a-b884-7875fe439e61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.048869 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ffa29ff-07bd-40cc-9853-a484f79b382f" (UID: "5ffa29ff-07bd-40cc-9853-a484f79b382f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049294 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82f9f1ca-7fe3-4e17-8393-20364149010d" (UID: "82f9f1ca-7fe3-4e17-8393-20364149010d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049394 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246" (OuterVolumeSpecName: "kube-api-access-n2246") pod "6545ddce-5b65-4702-9dee-2f2d9644123e" (UID: "6545ddce-5b65-4702-9dee-2f2d9644123e"). InnerVolumeSpecName "kube-api-access-n2246". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.049409 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6545ddce-5b65-4702-9dee-2f2d9644123e" (UID: "6545ddce-5b65-4702-9dee-2f2d9644123e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.051477 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt" (OuterVolumeSpecName: "kube-api-access-fqgdt") pod "82f9f1ca-7fe3-4e17-8393-20364149010d" (UID: "82f9f1ca-7fe3-4e17-8393-20364149010d"). InnerVolumeSpecName "kube-api-access-fqgdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.051920 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk" (OuterVolumeSpecName: "kube-api-access-8j6rk") pod "8da5c3a6-e588-412a-b884-7875fe439e61" (UID: "8da5c3a6-e588-412a-b884-7875fe439e61"). InnerVolumeSpecName "kube-api-access-8j6rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.052401 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs" (OuterVolumeSpecName: "kube-api-access-2tvcs") pod "d3a4204b-d91a-4d30-bea2-c327b452b61a" (UID: "d3a4204b-d91a-4d30-bea2-c327b452b61a"). InnerVolumeSpecName "kube-api-access-2tvcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.052844 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm" (OuterVolumeSpecName: "kube-api-access-7lmjm") pod "5ffa29ff-07bd-40cc-9853-a484f79b382f" (UID: "5ffa29ff-07bd-40cc-9853-a484f79b382f"). InnerVolumeSpecName "kube-api-access-7lmjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144770 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ffa29ff-07bd-40cc-9853-a484f79b382f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144808 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6545ddce-5b65-4702-9dee-2f2d9644123e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144844 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2246\" (UniqueName: \"kubernetes.io/projected/6545ddce-5b65-4702-9dee-2f2d9644123e-kube-api-access-n2246\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144861 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqgdt\" (UniqueName: \"kubernetes.io/projected/82f9f1ca-7fe3-4e17-8393-20364149010d-kube-api-access-fqgdt\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144875 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j6rk\" (UniqueName: \"kubernetes.io/projected/8da5c3a6-e588-412a-b884-7875fe439e61-kube-api-access-8j6rk\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144886 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82f9f1ca-7fe3-4e17-8393-20364149010d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144921 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvcs\" (UniqueName: \"kubernetes.io/projected/d3a4204b-d91a-4d30-bea2-c327b452b61a-kube-api-access-2tvcs\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144934 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3a4204b-d91a-4d30-bea2-c327b452b61a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144944 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8da5c3a6-e588-412a-b884-7875fe439e61-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.144956 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lmjm\" (UniqueName: \"kubernetes.io/projected/5ffa29ff-07bd-40cc-9853-a484f79b382f-kube-api-access-7lmjm\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839105 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-43f1-account-create-update-bsqmb" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839127 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-md2wm" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839147 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerStarted","Data":"77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c"} Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839154 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-9105-account-create-update-h4nvp" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839181 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pmrgf" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.839189 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e4-account-create-update-qtmr9" Jan 21 14:47:47 crc kubenswrapper[4720]: I0121 14:47:47.953404 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-f47pm" podStartSLOduration=2.662609213 podStartE2EDuration="9.953379589s" podCreationTimestamp="2026-01-21 14:47:38 +0000 UTC" firstStartedPulling="2026-01-21 14:47:39.477377456 +0000 UTC m=+1097.386117388" lastFinishedPulling="2026-01-21 14:47:46.768147822 +0000 UTC m=+1104.676887764" observedRunningTime="2026-01-21 14:47:47.862578275 +0000 UTC m=+1105.771318227" watchObservedRunningTime="2026-01-21 14:47:47.953379589 +0000 UTC m=+1105.862119531" Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.546852 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.633128 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.633631 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" containerID="cri-o://2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" gracePeriod=10 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.865257 4720 generic.go:334] "Generic (PLEG): container finished" podID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerID="2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" exitCode=0 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.865455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc"} Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.868323 4720 generic.go:334] "Generic (PLEG): container finished" podID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerID="77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c" exitCode=0 Jan 21 14:47:50 crc kubenswrapper[4720]: I0121 14:47:50.868353 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerDied","Data":"77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c"} Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.069945 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227813 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227861 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227902 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.227995 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.228026 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") pod \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\" (UID: \"c819d03b-78e1-470e-96dc-6144aa8e8f5a\") " Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.240855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8" (OuterVolumeSpecName: "kube-api-access-w9bw8") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "kube-api-access-w9bw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.284751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config" (OuterVolumeSpecName: "config") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.285050 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.286510 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333403 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333475 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9bw8\" (UniqueName: \"kubernetes.io/projected/c819d03b-78e1-470e-96dc-6144aa8e8f5a-kube-api-access-w9bw8\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333493 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.333542 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.341419 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c819d03b-78e1-470e-96dc-6144aa8e8f5a" (UID: "c819d03b-78e1-470e-96dc-6144aa8e8f5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.435014 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c819d03b-78e1-470e-96dc-6144aa8e8f5a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.880681 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" event={"ID":"c819d03b-78e1-470e-96dc-6144aa8e8f5a","Type":"ContainerDied","Data":"796d2687528fd87d25dd4fc1a5f89808d76b284cc0b9360ef63068e7663548e8"} Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.880738 4720 scope.go:117] "RemoveContainer" containerID="2833ba66e8cdea183ed0eb3bc5ec831775fbf6f4d81eeb54b6a98664af721cfc" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.882271 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-r6bj4" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.934210 4720 scope.go:117] "RemoveContainer" containerID="ce16ebb9a67a679cad4040701c2e535eabfd75f649979c91f4ea8e8bc1b64f6b" Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.943858 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:51 crc kubenswrapper[4720]: I0121 14:47:51.952528 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-r6bj4"] Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.238095 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350722 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.350905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") pod \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\" (UID: \"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17\") " Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.356993 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt" (OuterVolumeSpecName: "kube-api-access-fnhtt") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "kube-api-access-fnhtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.378165 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.405290 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data" (OuterVolumeSpecName: "config-data") pod "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" (UID: "3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452892 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnhtt\" (UniqueName: \"kubernetes.io/projected/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-kube-api-access-fnhtt\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452950 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.452964 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.689596 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" path="/var/lib/kubelet/pods/c819d03b-78e1-470e-96dc-6144aa8e8f5a/volumes" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891839 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-f47pm" event={"ID":"3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17","Type":"ContainerDied","Data":"dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410"} Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891895 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd7ece9b83e3449098bc10a01ae36a3ed9423d9ac3e9f3e6dd2c77855881410" Jan 21 14:47:52 crc kubenswrapper[4720]: I0121 14:47:52.891972 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-f47pm" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.146800 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147348 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147366 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147392 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147399 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147412 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147418 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147427 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147432 4720 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147442 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147447 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147457 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147462 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147471 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="init" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147476 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="init" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147489 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147495 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: E0121 14:47:53.147507 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147513 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.147639 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" containerName="keystone-db-sync" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149736 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149768 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c819d03b-78e1-470e-96dc-6144aa8e8f5a" containerName="dnsmasq-dns" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149779 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149792 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149801 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" containerName="mariadb-database-create" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149811 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.149833 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" containerName="mariadb-account-create-update" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.150793 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.166012 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170644 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170766 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170787 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.170871 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.218339 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.220698 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229165 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229390 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229500 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229616 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.229762 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.253268 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272396 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272442 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272478 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272500 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272525 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272545 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272595 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272776 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.272888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.273643 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.273832 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.275735 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.276268 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.329307 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lpjd\" (UniqueName: 
\"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"dnsmasq-dns-6546db6db7-swpsq\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373548 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373602 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373632 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373667 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373682 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.373721 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.381496 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.384861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.385115 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"keystone-bootstrap-sbng7\" (UID: 
\"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.390996 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.394028 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.399274 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"keystone-bootstrap-sbng7\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.466265 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.501380 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.509020 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.516430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.519134 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.555721 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.559482 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.568307 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.569268 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.579012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9pjf9" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.590067 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.637573 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.676951 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.678617 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680695 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680737 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680756 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680805 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680824 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680851 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680877 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680910 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.680929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc 
kubenswrapper[4720]: I0121 14:47:53.680944 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684321 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.684521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r779p" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.692523 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.722174 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.723060 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.726774 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r7487" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.726960 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.728247 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.761714 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783562 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783597 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783628 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783671 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783701 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783724 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783744 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783759 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783782 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783844 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.783862 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.789276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"ceilometer-0\" (UID: 
\"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.789733 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.790140 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.799281 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.800377 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.809409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.813017 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.813870 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.816182 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"barbican-db-sync-wtr5d\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") " pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.821042 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.841344 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.852019 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.855698 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"ceilometer-0\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.904835 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918552 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918621 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918726 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918780 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918812 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918865 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.918885 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.927688 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.929281 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.940250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.954300 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.960536 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"neutron-db-sync-fhvrr\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.974213 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.976089 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.978044 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.979846 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 14:47:53 crc kubenswrapper[4720]: I0121 14:47:53.980531 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7rq7c" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.010302 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042173 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042268 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042285 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042387 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.042595 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051806 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051858 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " 
pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051891 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.051992 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052065 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052101 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052128 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.052192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.055612 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.055854 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.058200 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.058693 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.068913 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.071168 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.100975 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"cinder-db-sync-vz5k2\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") " pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154117 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154160 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154199 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154219 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154237 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154256 4720 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154272 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154345 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.154385 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.157283 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.157597 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.158667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.158897 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.159161 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: 
\"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.162961 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.163184 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.163597 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.181232 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"placement-db-sync-fh44q\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.187235 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"dnsmasq-dns-7987f74bbc-fw4gh\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") " pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.352864 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.358066 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.362040 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:54 crc kubenswrapper[4720]: W0121 14:47:54.371170 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfe269e5_4b5b_4c08_acd0_2a2218d121f9.slice/crio-9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca WatchSource:0}: Error finding container 9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca: Status 404 returned error can't find the container with id 9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.377013 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.620946 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.661287 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:54 crc kubenswrapper[4720]: W0121 14:47:54.684478 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb266c3f_6b50_4953_9f9f_9b41bfc3c4c2.slice/crio-97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846 WatchSource:0}: Error finding container 97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846: Status 404 returned error can't find the container with id 97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846 Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.829328 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:47:54 crc kubenswrapper[4720]: I0121 14:47:54.860993 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.001406 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.002286 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerStarted","Data":"1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.003592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerStarted","Data":"2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.005422 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerStarted","Data":"8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009873 4720 generic.go:334] "Generic (PLEG): container finished" podID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerID="8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae" exitCode=0 Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009900 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerDied","Data":"8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.009915 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerStarted","Data":"9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca"} Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.053218 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 
14:47:55.071675 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.188923 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.488111 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.677959 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688617 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688684 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688807 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688836 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.688866 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") pod \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\" (UID: \"cfe269e5-4b5b-4c08-acd0-2a2218d121f9\") " Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.703111 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd" (OuterVolumeSpecName: "kube-api-access-5lpjd") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "kube-api-access-5lpjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.733916 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.740463 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.754774 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.783340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config" (OuterVolumeSpecName: "config") pod "cfe269e5-4b5b-4c08-acd0-2a2218d121f9" (UID: "cfe269e5-4b5b-4c08-acd0-2a2218d121f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792339 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792461 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lpjd\" (UniqueName: \"kubernetes.io/projected/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-kube-api-access-5lpjd\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792534 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792720 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:55 crc kubenswrapper[4720]: I0121 14:47:55.792797 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfe269e5-4b5b-4c08-acd0-2a2218d121f9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018575 4720 generic.go:334] "Generic (PLEG): container finished" podID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868" exitCode=0 Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018641 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.018700 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" 
event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerStarted","Data":"f174f6fdc28ef67dfde4e4bef9624d4a07461dcbbd47420bdd6d732525e7403e"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.022451 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerStarted","Data":"bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" event={"ID":"cfe269e5-4b5b-4c08-acd0-2a2218d121f9","Type":"ContainerDied","Data":"9ef3ada754eadcf7f93f4318d7d259dfeee6dbb6d912965144d7d48ff1f35aca"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071444 4720 scope.go:117] "RemoveContainer" containerID="8d7b3178a9503cf3c70be6d2af65ac2abae3603160af505b1963e47f85c4e8ae" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.071601 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-swpsq" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.105512 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sbng7" podStartSLOduration=3.105495672 podStartE2EDuration="3.105495672s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:56.096027767 +0000 UTC m=+1114.004767699" watchObservedRunningTime="2026-01-21 14:47:56.105495672 +0000 UTC m=+1114.014235604" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.119488 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerStarted","Data":"acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.128683 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerStarted","Data":"4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.130587 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerStarted","Data":"45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699"} Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.149788 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fhvrr" podStartSLOduration=3.149770554 podStartE2EDuration="3.149770554s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:56.145382131 +0000 UTC m=+1114.054122053" watchObservedRunningTime="2026-01-21 14:47:56.149770554 +0000 UTC m=+1114.058510486" Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.206883 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:56 crc kubenswrapper[4720]: I0121 14:47:56.231784 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-swpsq"] Jan 21 14:47:56 crc 
kubenswrapper[4720]: I0121 14:47:56.706345 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" path="/var/lib/kubelet/pods/cfe269e5-4b5b-4c08-acd0-2a2218d121f9/volumes" Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.141066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerStarted","Data":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"} Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.142369 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:47:57 crc kubenswrapper[4720]: I0121 14:47:57.163589 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" podStartSLOduration=4.163116935 podStartE2EDuration="4.163116935s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:47:57.161703958 +0000 UTC m=+1115.070443890" watchObservedRunningTime="2026-01-21 14:47:57.163116935 +0000 UTC m=+1115.071856857" Jan 21 14:48:01 crc kubenswrapper[4720]: I0121 14:48:01.203346 4720 generic.go:334] "Generic (PLEG): container finished" podID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerID="bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f" exitCode=0 Jan 21 14:48:01 crc kubenswrapper[4720]: I0121 14:48:01.203380 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerDied","Data":"bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f"} Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.360463 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.416893 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:48:04 crc kubenswrapper[4720]: I0121 14:48:04.417168 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" containerID="cri-o://e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666" gracePeriod=10 Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.238103 4720 generic.go:334] "Generic (PLEG): container finished" podID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerID="e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666" exitCode=0 Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.238145 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666"} Jan 21 14:48:05 crc kubenswrapper[4720]: I0121 14:48:05.544854 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.629024 4720 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656540 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656621 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656691 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656721 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656745 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.656809 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") pod \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\" (UID: \"2db18bae-9cc2-4e10-b04c-edd0bb539b79\") " Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.701009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.701027 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.703091 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8" (OuterVolumeSpecName: "kube-api-access-sbvd8") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "kube-api-access-sbvd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.700737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts" (OuterVolumeSpecName: "scripts") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.716011 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data" (OuterVolumeSpecName: "config-data") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.728886 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2db18bae-9cc2-4e10-b04c-edd0bb539b79" (UID: "2db18bae-9cc2-4e10-b04c-edd0bb539b79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758275 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758313 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbvd8\" (UniqueName: \"kubernetes.io/projected/2db18bae-9cc2-4e10-b04c-edd0bb539b79-kube-api-access-sbvd8\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758327 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758341 4720 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758352 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:06 crc kubenswrapper[4720]: I0121 14:48:06.758363 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2db18bae-9cc2-4e10-b04c-edd0bb539b79-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.265747 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sbng7" event={"ID":"2db18bae-9cc2-4e10-b04c-edd0bb539b79","Type":"ContainerDied","Data":"8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126"} Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.266063 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1d6ab9eaf9f02c816601f4cd4ee4cc6e910d8f782825a76f18664eab273126" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.265886 4720 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sbng7" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.743980 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.750903 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sbng7"] Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805354 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:48:07 crc kubenswrapper[4720]: E0121 14:48:07.805905 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805918 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init" Jan 21 14:48:07 crc kubenswrapper[4720]: E0121 14:48:07.805936 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.805943 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806138 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" containerName="keystone-bootstrap" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806159 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe269e5-4b5b-4c08-acd0-2a2218d121f9" containerName="init" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.806715 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809301 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809521 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.809644 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.810322 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.810613 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.815004 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988116 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988198 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988270 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988342 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988371 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:07 crc kubenswrapper[4720]: I0121 14:48:07.988397 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090577 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090643 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090730 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090757 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.090776 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.096305 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.097087 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.099991 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.104824 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"keystone-bootstrap-lsn2k\" (UID: 
\"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.107899 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.114057 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"keystone-bootstrap-lsn2k\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.133917 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:08 crc kubenswrapper[4720]: I0121 14:48:08.692677 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db18bae-9cc2-4e10-b04c-edd0bb539b79" path="/var/lib/kubelet/pods/2db18bae-9cc2-4e10-b04c-edd0bb539b79/volumes" Jan 21 14:48:10 crc kubenswrapper[4720]: I0121 14:48:10.545304 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.541633 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.542165 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqjpm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-wtr5d_openstack(2eaf7930-34cf-4396-9b94-c09d3a5da09a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:48:15 crc kubenswrapper[4720]: E0121 14:48:15.544227 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-wtr5d" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" Jan 21 14:48:15 crc kubenswrapper[4720]: I0121 14:48:15.545680 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Jan 21 14:48:15 crc kubenswrapper[4720]: I0121 14:48:15.545789 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.385287 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-wtr5d" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.766732 4720 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.767174 4720 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85gwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-vz5k2_openstack(d468a637-b18d-47fd-9b04-910dba72a955): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 21 14:48:16 crc kubenswrapper[4720]: E0121 14:48:16.768492 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-vz5k2" podUID="d468a637-b18d-47fd-9b04-910dba72a955" Jan 21 14:48:16 crc kubenswrapper[4720]: I0121 14:48:16.929684 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.059637 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.059974 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060024 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060075 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.060094 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") pod \"e7c8b46b-d758-4538-a345-21ccc71aabe4\" (UID: \"e7c8b46b-d758-4538-a345-21ccc71aabe4\") " Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.071010 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l" (OuterVolumeSpecName: "kube-api-access-6v27l") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "kube-api-access-6v27l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.105445 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.107148 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.111022 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config" (OuterVolumeSpecName: "config") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.117034 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7c8b46b-d758-4538-a345-21ccc71aabe4" (UID: "e7c8b46b-d758-4538-a345-21ccc71aabe4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161733 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v27l\" (UniqueName: \"kubernetes.io/projected/e7c8b46b-d758-4538-a345-21ccc71aabe4-kube-api-access-6v27l\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161769 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161781 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161792 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.161806 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7c8b46b-d758-4538-a345-21ccc71aabe4-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.175006 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:48:17 crc kubenswrapper[4720]: W0121 14:48:17.185751 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03e400cd_53d2_4738_96f0_75829e339879.slice/crio-914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8 WatchSource:0}: Error finding container 914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8: Status 404 returned error can't find the container with id 914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8 Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.391795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" event={"ID":"e7c8b46b-d758-4538-a345-21ccc71aabe4","Type":"ContainerDied","Data":"e1f0107509de082453c41da25f574e86b230a27ade7a48580805b9ac2072e586"} Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.391827 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-7dx9d" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.392139 4720 scope.go:117] "RemoveContainer" containerID="e0a764a335a04c77ad3eddd776f3c96760f97db3ae9ca0277c3d3c8d73645666" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.398043 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"} Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.402030 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerStarted","Data":"4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91"} Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.402062 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerStarted","Data":"914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8"} Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.406257 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerStarted","Data":"e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130"} Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.416047 4720 scope.go:117] "RemoveContainer" containerID="856c89063d4a8ffd0dea2ed8327f087d5552a6933c06f250049624af9b370e87" Jan 21 14:48:17 crc kubenswrapper[4720]: E0121 14:48:17.416431 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-vz5k2" podUID="d468a637-b18d-47fd-9b04-910dba72a955" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.435765 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lsn2k" podStartSLOduration=10.435737429 podStartE2EDuration="10.435737429s" podCreationTimestamp="2026-01-21 14:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:17.428220815 +0000 UTC m=+1135.336960787" watchObservedRunningTime="2026-01-21 14:48:17.435737429 +0000 UTC m=+1135.344477391" Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.477574 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.487251 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-7dx9d"] Jan 21 14:48:17 crc kubenswrapper[4720]: I0121 14:48:17.493022 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fh44q" podStartSLOduration=2.956255761 podStartE2EDuration="24.493003148s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:55.188357806 +0000 UTC m=+1113.097097728" lastFinishedPulling="2026-01-21 14:48:16.725105183 +0000 UTC m=+1134.633845115" observedRunningTime="2026-01-21 14:48:17.485113494 +0000 UTC m=+1135.393853446" watchObservedRunningTime="2026-01-21 14:48:17.493003148 +0000 UTC 
m=+1135.401743080" Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.418413 4720 generic.go:334] "Generic (PLEG): container finished" podID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerID="4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07" exitCode=0 Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.419700 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerDied","Data":"4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07"} Jan 21 14:48:18 crc kubenswrapper[4720]: I0121 14:48:18.690197 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" path="/var/lib/kubelet/pods/e7c8b46b-d758-4538-a345-21ccc71aabe4/volumes" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.427965 4720 generic.go:334] "Generic (PLEG): container finished" podID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerID="e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130" exitCode=0 Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.428324 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerDied","Data":"e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130"} Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.438884 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"} Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.760795 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.845922 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.845994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.846082 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") pod \"7a6c6de6-8f88-4c87-bd8e-46579996948e\" (UID: \"7a6c6de6-8f88-4c87-bd8e-46579996948e\") " Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.851975 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq" (OuterVolumeSpecName: "kube-api-access-fb2pq") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "kube-api-access-fb2pq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.875837 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config" (OuterVolumeSpecName: "config") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.884017 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a6c6de6-8f88-4c87-bd8e-46579996948e" (UID: "7a6c6de6-8f88-4c87-bd8e-46579996948e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947197 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947236 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2pq\" (UniqueName: \"kubernetes.io/projected/7a6c6de6-8f88-4c87-bd8e-46579996948e-kube-api-access-fb2pq\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:19 crc kubenswrapper[4720]: I0121 14:48:19.947247 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7a6c6de6-8f88-4c87-bd8e-46579996948e-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.450867 4720 generic.go:334] "Generic (PLEG): container finished" podID="03e400cd-53d2-4738-96f0-75829e339879" containerID="4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91" exitCode=0 Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.450961 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerDied","Data":"4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91"} Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457819 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fhvrr" event={"ID":"7a6c6de6-8f88-4c87-bd8e-46579996948e","Type":"ContainerDied","Data":"1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450"} Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457858 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1122f0765aa0595857830e6b23a4de080fa5792bca4d2fa6099c5367b12ae450" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.457981 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fhvrr" Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.625626 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a6c6de6_8f88_4c87_bd8e_46579996948e.slice\": RecentStats: unable to find data in memory cache]" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649058 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649356 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649369 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync" Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649377 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649385 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" Jan 21 14:48:20 crc kubenswrapper[4720]: E0121 14:48:20.649394 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="init" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649400 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="init" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649544 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" containerName="neutron-db-sync" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.649554 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c8b46b-d758-4538-a345-21ccc71aabe4" containerName="dnsmasq-dns" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.650279 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.674549 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777467 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777722 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777901 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.777996 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.778117 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.880540 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881249 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881457 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881683 4720 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.881884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.883590 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884194 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.884838 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.918769 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.920061 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.931667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"dnsmasq-dns-7b946d459c-99cnk\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.944347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.944553 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.945021 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.945125 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r779p" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.970638 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987071 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987132 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987176 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:20 crc kubenswrapper[4720]: I0121 14:48:20.987262 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.006835 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089687 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089740 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.089758 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.098159 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.103030 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.103707 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.133514 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.133527 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"neutron-7f78c5dfcb-hsblf\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.238202 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.323023 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394734 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394800 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394840 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.394857 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") pod \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\" (UID: \"72a4a042-08eb-4644-81c0-2cfcd105cf2b\") " Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.395441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs" (OuterVolumeSpecName: "logs") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.402935 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts" (OuterVolumeSpecName: "scripts") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.413799 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g" (OuterVolumeSpecName: "kube-api-access-xpq8g") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "kube-api-access-xpq8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.431381 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.449263 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data" (OuterVolumeSpecName: "config-data") pod "72a4a042-08eb-4644-81c0-2cfcd105cf2b" (UID: "72a4a042-08eb-4644-81c0-2cfcd105cf2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498459 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498498 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpq8g\" (UniqueName: \"kubernetes.io/projected/72a4a042-08eb-4644-81c0-2cfcd105cf2b-kube-api-access-xpq8g\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498510 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498520 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/72a4a042-08eb-4644-81c0-2cfcd105cf2b-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.498531 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72a4a042-08eb-4644-81c0-2cfcd105cf2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.517514 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.521819 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fh44q" Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.522071 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fh44q" event={"ID":"72a4a042-08eb-4644-81c0-2cfcd105cf2b","Type":"ContainerDied","Data":"45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699"} Jan 21 14:48:21 crc kubenswrapper[4720]: I0121 14:48:21.522124 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45355f1df1e84e368508f9990990c5cc32fac5fefbdc4cf2346851c86675e699" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.414371 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8648996d7d-4f2q4"] Jan 21 14:48:22 crc kubenswrapper[4720]: E0121 14:48:22.415771 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.415794 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.415940 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" containerName="placement-db-sync" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.416957 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.424543 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.424936 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425107 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425116 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.425258 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7rq7c" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.430747 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8648996d7d-4f2q4"] Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521078 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521161 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521176 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drwbw\" (UniqueName: \"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521209 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521231 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.521254 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622132 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622207 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622258 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622278 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drwbw\" (UniqueName: 
\"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622327 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.622352 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.625911 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-logs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.629922 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-config-data\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.630691 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-combined-ca-bundle\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.637020 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-scripts\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.641705 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-internal-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.649851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drwbw\" (UniqueName: \"kubernetes.io/projected/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-kube-api-access-drwbw\") pod \"placement-8648996d7d-4f2q4\" (UID: \"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.653104 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e9aac3-9710-4d1c-88a7-1a0a22b5a593-public-tls-certs\") pod \"placement-8648996d7d-4f2q4\" (UID: 
\"37e9aac3-9710-4d1c-88a7-1a0a22b5a593\") " pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:22 crc kubenswrapper[4720]: I0121 14:48:22.760764 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.212181 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"] Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.213754 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.216977 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.217329 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.257347 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"] Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.354849 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.354970 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356020 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356482 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.356513 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 
14:48:24.356563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.457996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458073 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458124 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458143 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.458203 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.463951 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-ovndb-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.465626 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-combined-ca-bundle\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.468223 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.472937 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-public-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.475301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-httpd-config\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.475581 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-internal-tls-certs\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.477546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9bj\" (UniqueName: \"kubernetes.io/projected/7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7-kube-api-access-fw9bj\") pod \"neutron-6c8b4f85f7-4kz9x\" (UID: \"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7\") " pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:24 crc kubenswrapper[4720]: I0121 14:48:24.535372 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:30 crc kubenswrapper[4720]: I0121 14:48:30.297621 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372440 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372478 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372520 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372600 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.372692 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") pod \"03e400cd-53d2-4738-96f0-75829e339879\" (UID: \"03e400cd-53d2-4738-96f0-75829e339879\") " Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s" (OuterVolumeSpecName: "kube-api-access-mnr6s") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "kube-api-access-mnr6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389220 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts" (OuterVolumeSpecName: "scripts") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389286 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.389278 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.438231 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data" (OuterVolumeSpecName: "config-data") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.446786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03e400cd-53d2-4738-96f0-75829e339879" (UID: "03e400cd-53d2-4738-96f0-75829e339879"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474412 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474611 4720 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474621 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474630 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnr6s\" (UniqueName: \"kubernetes.io/projected/03e400cd-53d2-4738-96f0-75829e339879-kube-api-access-mnr6s\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474639 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.474646 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03e400cd-53d2-4738-96f0-75829e339879-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.607535 4720 generic.go:334] "Generic (PLEG): container finished" podID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerID="6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208" exitCode=0 Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.607586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208"} Jan 21 14:48:31 crc 
kubenswrapper[4720]: I0121 14:48:30.607609 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerStarted","Data":"63bb1ec50feeba6e58d30594b8e6156c8d4ea90e8cd1ba58353f3003db3dc734"}
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.612357 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"}
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614766 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lsn2k" event={"ID":"03e400cd-53d2-4738-96f0-75829e339879","Type":"ContainerDied","Data":"914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8"}
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614793 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lsn2k"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:30.614795 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="914afd90c63529f4374d55c95eb1d303cd6e2f2553422262b39f93a0e25feee8"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.580565 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-69cc8766db-gdch7"]
Jan 21 14:48:31 crc kubenswrapper[4720]: E0121 14:48:31.581143 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581155 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581286 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e400cd-53d2-4738-96f0-75829e339879" containerName="keystone-bootstrap"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.581789 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.591366 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.591535 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pb4wb"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592012 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592113 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.592210 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.594069 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.634533 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635523 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635554 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635581 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635637 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635678 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635696 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.635797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.641704 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69cc8766db-gdch7"]
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.652431 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"]
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.676637 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerStarted","Data":"7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410"}
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.677744 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.680148 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerStarted","Data":"da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38"}
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.704546 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8648996d7d-4f2q4"]
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.729521 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podStartSLOduration=11.729507035 podStartE2EDuration="11.729507035s" podCreationTimestamp="2026-01-21 14:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:31.72928086 +0000 UTC m=+1149.638020802" watchObservedRunningTime="2026-01-21 14:48:31.729507035 +0000 UTC m=+1149.638246967"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737635 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737826 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737893 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737944 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.737993 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738158 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.738249 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-scripts\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776416 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-public-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.776462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-credential-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-config-data\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784533 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-fernet-keys\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.784876 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-internal-tls-certs\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.785482 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw78t\" (UniqueName: \"kubernetes.io/projected/0edd5078-75bc-4823-b52f-ad5effeace06-kube-api-access-qw78t\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.800301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edd5078-75bc-4823-b52f-ad5effeace06-combined-ca-bundle\") pod \"keystone-69cc8766db-gdch7\" (UID: \"0edd5078-75bc-4823-b52f-ad5effeace06\") " pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.812191 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-vz5k2" podStartSLOduration=3.603836878 podStartE2EDuration="38.812175719s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:55.096847243 +0000 UTC m=+1113.005587175" lastFinishedPulling="2026-01-21 14:48:30.305186084 +0000 UTC m=+1148.213926016" observedRunningTime="2026-01-21 14:48:31.785161092 +0000 UTC m=+1149.693901024" watchObservedRunningTime="2026-01-21 14:48:31.812175719 +0000 UTC m=+1149.720915651"
Jan 21 14:48:31 crc kubenswrapper[4720]: I0121 14:48:31.927648 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c8b4f85f7-4kz9x"]
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.055469 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.639257 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-69cc8766db-gdch7"]
Jan 21 14:48:32 crc kubenswrapper[4720]: W0121 14:48:32.648043 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0edd5078_75bc_4823_b52f_ad5effeace06.slice/crio-cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b WatchSource:0}: Error finding container cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b: Status 404 returned error can't find the container with id cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698533 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"8237450b616f813ffba74d9888f3ca2e07afe776f646df699e7891cc5569f709"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698804 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"0d503bdd2ab088d4b834ba24a297b1d2f6ddf36e67e8327ac4567dab605f8dc6"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698814 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8648996d7d-4f2q4" event={"ID":"37e9aac3-9710-4d1c-88a7-1a0a22b5a593","Type":"ContainerStarted","Data":"25b1b940e7a3af70b2d71e8f9b26e1a866ed65c52a96efa0a6d7b8493de2ed4b"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698845 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.698864 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-8648996d7d-4f2q4"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.700627 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerStarted","Data":"aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703359 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"25b8ba9f43d4e200b4217f3ee0a98bd5fc4bebff3320e0c0fe7a54f03c943bc4"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703396 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"5e5dd967c7901c33d6bc8eceb49611162cac7ed45b2d8530b7a6e9383438987d"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.703407 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c8b4f85f7-4kz9x" event={"ID":"7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7","Type":"ContainerStarted","Data":"232752c2f62e2e7c660628d602ace55a54c7920d10aa16bb08ffd899712eedf3"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.705452 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c8b4f85f7-4kz9x"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710798 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710836 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.710847 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerStarted","Data":"68237d054d159a4765c532ce618148ffe9f3f4fde0b73c4c9dd1d07a506b6603"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.711844 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f78c5dfcb-hsblf"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.713041 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69cc8766db-gdch7" event={"ID":"0edd5078-75bc-4823-b52f-ad5effeace06","Type":"ContainerStarted","Data":"cc1ef91ef553b9d691f09c291df3e53455ca32fa4f02406270b1b0f98eeb456b"}
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.816627 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6c8b4f85f7-4kz9x" podStartSLOduration=8.816611839 podStartE2EDuration="8.816611839s" podCreationTimestamp="2026-01-21 14:48:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:32.812119293 +0000 UTC m=+1150.720859235" watchObservedRunningTime="2026-01-21 14:48:32.816611839 +0000 UTC m=+1150.725351771"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.840354 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f78c5dfcb-hsblf" podStartSLOduration=12.840337542 podStartE2EDuration="12.840337542s" podCreationTimestamp="2026-01-21 14:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:32.838346621 +0000 UTC m=+1150.747086553" watchObservedRunningTime="2026-01-21 14:48:32.840337542 +0000 UTC m=+1150.749077474"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.869056 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wtr5d" podStartSLOduration=3.315160454 podStartE2EDuration="39.869040203s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:54.878501136 +0000 UTC m=+1112.787241068" lastFinishedPulling="2026-01-21 14:48:31.432380885 +0000 UTC m=+1149.341120817" observedRunningTime="2026-01-21 14:48:32.864065864 +0000 UTC m=+1150.772805796" watchObservedRunningTime="2026-01-21 14:48:32.869040203 +0000 UTC m=+1150.777780135"
Jan 21 14:48:32 crc kubenswrapper[4720]: I0121 14:48:32.894241 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8648996d7d-4f2q4" podStartSLOduration=10.894224433 podStartE2EDuration="10.894224433s" podCreationTimestamp="2026-01-21 14:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:32.889980684 +0000 UTC m=+1150.798720626" watchObservedRunningTime="2026-01-21 14:48:32.894224433 +0000 UTC m=+1150.802964365"
Jan 21 14:48:33 crc kubenswrapper[4720]: I0121 14:48:33.721246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-69cc8766db-gdch7" event={"ID":"0edd5078-75bc-4823-b52f-ad5effeace06","Type":"ContainerStarted","Data":"4618462f6d880a57ca93707068b944d2fd4d41e47ea3695e54afec3affa91c60"}
Jan 21 14:48:33 crc kubenswrapper[4720]: I0121 14:48:33.741274 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-69cc8766db-gdch7" podStartSLOduration=2.74125857 podStartE2EDuration="2.74125857s" podCreationTimestamp="2026-01-21 14:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:33.739026432 +0000 UTC m=+1151.647766364" watchObservedRunningTime="2026-01-21 14:48:33.74125857 +0000 UTC m=+1151.649998502"
Jan 21 14:48:34 crc kubenswrapper[4720]: I0121 14:48:34.731780 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-69cc8766db-gdch7"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.009438 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-99cnk"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.103679 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"]
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.104142 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns" containerID="cri-o://807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" gracePeriod=10
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.556467 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642315 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") "
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642413 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") "
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642458 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") "
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642568 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") "
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.642587 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") pod \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\" (UID: \"6a60b31b-eca6-4e2d-8dcd-0097033a8a35\") "
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.659980 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f" (OuterVolumeSpecName: "kube-api-access-2bv6f") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "kube-api-access-2bv6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.690192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.690894 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.700389 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.710785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config" (OuterVolumeSpecName: "config") pod "6a60b31b-eca6-4e2d-8dcd-0097033a8a35" (UID: "6a60b31b-eca6-4e2d-8dcd-0097033a8a35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.744331 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745444 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-config\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745540 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bv6f\" (UniqueName: \"kubernetes.io/projected/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-kube-api-access-2bv6f\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745602 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745674 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a60b31b-eca6-4e2d-8dcd-0097033a8a35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745619 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerDied","Data":"aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234"}
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.745598 4720 generic.go:334] "Generic (PLEG): container finished" podID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerID="aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234" exitCode=0
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749236 4720 generic.go:334] "Generic (PLEG): container finished" podID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17" exitCode=0
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749274 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"}
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749299 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh" event={"ID":"6a60b31b-eca6-4e2d-8dcd-0097033a8a35","Type":"ContainerDied","Data":"f174f6fdc28ef67dfde4e4bef9624d4a07461dcbbd47420bdd6d732525e7403e"}
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749316 4720 scope.go:117] "RemoveContainer" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.749434 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-fw4gh"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.792149 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"]
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.799636 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-fw4gh"]
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.800454 4720 scope.go:117] "RemoveContainer" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830260 4720 scope.go:117] "RemoveContainer" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"
Jan 21 14:48:36 crc kubenswrapper[4720]: E0121 14:48:36.830861 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": container with ID starting with 807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17 not found: ID does not exist" containerID="807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830908 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17"} err="failed to get container status \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": rpc error: code = NotFound desc = could not find container \"807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17\": container with ID starting with 807889b31df7e2b2850227042ad536d3c2a4d3c7125a563f44e02b57c276cf17 not found: ID does not exist"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.830934 4720 scope.go:117] "RemoveContainer" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"
Jan 21 14:48:36 crc kubenswrapper[4720]: E0121 14:48:36.831613 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": container with ID starting with e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868 not found: ID does not exist" containerID="e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"
Jan 21 14:48:36 crc kubenswrapper[4720]: I0121 14:48:36.831681 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868"} err="failed to get container status \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": rpc error: code = NotFound desc = could not find container \"e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868\": container with ID starting with e46f112e6ec1160fcf32ab53a073a80fc27df442931a4ca3b0306e8fe8981868 not found: ID does not exist"
Jan 21 14:48:37 crc kubenswrapper[4720]: I0121 14:48:37.758910 4720 generic.go:334] "Generic (PLEG): container finished" podID="d468a637-b18d-47fd-9b04-910dba72a955" containerID="da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38" exitCode=0
Jan 21 14:48:37 crc kubenswrapper[4720]: I0121 14:48:37.758981 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerDied","Data":"da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38"}
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.099999 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d"
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201449 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") "
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201754 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") "
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.201775 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") pod \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\" (UID: \"2eaf7930-34cf-4396-9b94-c09d3a5da09a\") "
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.209882 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.210856 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm" (OuterVolumeSpecName: "kube-api-access-sqjpm") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "kube-api-access-sqjpm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.234917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eaf7930-34cf-4396-9b94-c09d3a5da09a" (UID: "2eaf7930-34cf-4396-9b94-c09d3a5da09a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303384 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjpm\" (UniqueName: \"kubernetes.io/projected/2eaf7930-34cf-4396-9b94-c09d3a5da09a-kube-api-access-sqjpm\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303648 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.303679 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eaf7930-34cf-4396-9b94-c09d3a5da09a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.694024 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" path="/var/lib/kubelet/pods/6a60b31b-eca6-4e2d-8dcd-0097033a8a35/volumes"
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770849 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wtr5d"
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770846 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wtr5d" event={"ID":"2eaf7930-34cf-4396-9b94-c09d3a5da09a","Type":"ContainerDied","Data":"2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82"}
Jan 21 14:48:38 crc kubenswrapper[4720]: I0121 14:48:38.770889 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b7bc7ab0da041d56c2df3e5b44877c1c08f4a7afb2d85c0210b22ee47e43e82"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062506 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"]
Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062775 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062787 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062805 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="init"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062811 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="init"
Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.062829 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.062835 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.063665 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" containerName="barbican-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.063685 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a60b31b-eca6-4e2d-8dcd-0097033a8a35" containerName="dnsmasq-dns"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.064438 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067159 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067430 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.067531 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9pjf9"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.131367 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.158432 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"]
Jan 21 14:48:39 crc kubenswrapper[4720]: E0121 14:48:39.179783 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.179818 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.180106 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d468a637-b18d-47fd-9b04-910dba72a955" containerName="cinder-db-sync"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.180982 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.197298 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.197537 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.209056 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221708 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221784 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221804 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.221891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.245052 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.270236 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.270425 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323720 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323923 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.323952 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324017 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324047 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.324122 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") pod \"d468a637-b18d-47fd-9b04-910dba72a955\" (UID: \"d468a637-b18d-47fd-9b04-910dba72a955\") "
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327302 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327321 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327358 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327382 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327491 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327509 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327529 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.327698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.328197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9355d502-bf01-4465-996d-483d99b92954-logs\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.332834 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.338825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts" (OuterVolumeSpecName: "scripts") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.339900 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-combined-ca-bundle\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.343826 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv" (OuterVolumeSpecName: "kube-api-access-85gwv") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "kube-api-access-85gwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.348998 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.349035 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.362382 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9355d502-bf01-4465-996d-483d99b92954-config-data-custom\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.387480 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.388646 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.396076 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.408171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmsn\" (UniqueName: \"kubernetes.io/projected/9355d502-bf01-4465-996d-483d99b92954-kube-api-access-dtmsn\") pod \"barbican-worker-8f88c9d47-m5rzn\" (UID: \"9355d502-bf01-4465-996d-483d99b92954\") " pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.417501 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429142 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429196 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429217 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429234 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429250 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429274 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429288 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429338 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429363 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429405 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429450 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429460 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d468a637-b18d-47fd-9b04-910dba72a955-etc-machine-id\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429469 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429478 4720 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.429486 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85gwv\" (UniqueName: \"kubernetes.io/projected/d468a637-b18d-47fd-9b04-910dba72a955-kube-api-access-85gwv\") on node \"crc\" DevicePath \"\""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.433135 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-combined-ca-bundle\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.433404 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bb475766-6891-454b-8f7e-1494d9806891-logs\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.443575 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-8f88c9d47-m5rzn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.448315 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data-custom\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.460383 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data" (OuterVolumeSpecName: "config-data") pod "d468a637-b18d-47fd-9b04-910dba72a955" (UID: "d468a637-b18d-47fd-9b04-910dba72a955"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.475018 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb475766-6891-454b-8f7e-1494d9806891-config-data\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.480722 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"]
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.514457 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqd9\" (UniqueName: \"kubernetes.io/projected/bb475766-6891-454b-8f7e-1494d9806891-kube-api-access-fwqd9\") pod \"barbican-keystone-listener-6898c4b994-dn9qn\" (UID: \"bb475766-6891-454b-8f7e-1494d9806891\") " pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.524056 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.530921 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.535805 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.535982 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541110 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541343 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541447 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541637 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz"
Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.541762 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\")
pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.542128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.542333 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d468a637-b18d-47fd-9b04-910dba72a955-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.543358 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.544199 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.544271 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.546013 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.557455 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"dnsmasq-dns-6bb684768f-qtbjz\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.594981 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.643584 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644031 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644103 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644187 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.644825 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.648185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.651326 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.661234 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc 
kubenswrapper[4720]: I0121 14:48:39.773000 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"barbican-api-7c6979b468-whx5j\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.845113 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.852923 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-vz5k2" event={"ID":"d468a637-b18d-47fd-9b04-910dba72a955","Type":"ContainerDied","Data":"acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705"} Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.852961 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf7d0f328c5178aaf28b6696e5f846b6ac605dd876f51d9605333f5abd8e705" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.853061 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-vz5k2" Jan 21 14:48:39 crc kubenswrapper[4720]: I0121 14:48:39.989934 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.000446 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-r7487" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.013948 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.014068 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.041679 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.110789 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-8f88c9d47-m5rzn"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166069 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166414 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166500 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166528 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.166607 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.233788 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.256757 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6898c4b994-dn9qn"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268548 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268612 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268674 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268720 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.268774 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc 
kubenswrapper[4720]: I0121 14:48:40.268830 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.269995 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.272813 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.274416 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.285802 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.286323 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.290969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.309398 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"cinder-scheduler-0\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.379136 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492094 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492361 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492483 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492523 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.492562 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.560037 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.572410 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.572465 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.578953 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593736 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593793 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593827 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.593899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.594796 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595318 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595701 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.595843 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: 
\"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.657303 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"dnsmasq-dns-6d97fcdd8f-7f22v\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729824 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729876 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729900 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.729979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730048 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730066 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.730091 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.806201 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.823438 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.833070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834216 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834291 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834360 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834451 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.834517 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.836673 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.842103 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.852293 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.853012 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.860827 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.877473 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.903916 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"dc9243e8f48bab6e285da615fb644a56108b7aafede7cddffde36655963464f8"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.909570 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"000a5953f4f4acbb293f3fd20f9c4128f67fc3aedf2d6b521a90bf43134a33df"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.916398 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerStarted","Data":"c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3"} Jan 21 14:48:40 crc kubenswrapper[4720]: I0121 14:48:40.936383 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"cinder-api-0\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") " pod="openstack/cinder-api-0" Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.153429 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.204889 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.648782 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:48:41 crc kubenswrapper[4720]: W0121 14:48:41.686785 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c32ba08_0c9c_4f0a_b9f4_f56e91ee566e.slice/crio-90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49 WatchSource:0}: Error finding container 90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49: Status 404 returned error can't find the container with id 90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49 Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.844578 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.863113 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.961592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.975551 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"88ba151a7faa9c14ce1ffb6ea4af84a9781883a8fd8016a1bc953df79c7742c2"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.977019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerStarted","Data":"c28c627632464fe77ab3019eb5addeb203e6d652bffef45ba63964d4aabbdd0c"} Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.979456 4720 generic.go:334] "Generic (PLEG): container finished" podID="4775aea1-f465-4995-a37a-1285ed8229dd" containerID="59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1" exitCode=0 Jan 21 14:48:41 crc kubenswrapper[4720]: I0121 14:48:41.979501 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerDied","Data":"59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1"} Jan 21 14:48:42 crc kubenswrapper[4720]: I0121 14:48:42.008301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} Jan 21 14:48:42 crc kubenswrapper[4720]: I0121 14:48:42.008365 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"3d7a3ecb27979a3b44e08cb4019116ada1170edb766f69f4f67fae7e26496dfb"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.011476 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.018552 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" 
event={"ID":"4775aea1-f465-4995-a37a-1285ed8229dd","Type":"ContainerDied","Data":"c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.018591 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c620d8e3e9bc430960eba755b5a007861ddf15b09ef6b966ea58b1b2e0f572a3" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020429 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerStarted","Data":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020575 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.020603 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.021573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.022626 4720 generic.go:334] "Generic (PLEG): container finished" podID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerID="4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506" exitCode=0 Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.022669 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506"} Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.030498 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.046539 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7c6979b468-whx5j" podStartSLOduration=4.046522812 podStartE2EDuration="4.046522812s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:43.044287744 +0000 UTC m=+1160.953027676" watchObservedRunningTime="2026-01-21 14:48:43.046522812 +0000 UTC m=+1160.955262744" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087631 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087693 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087758 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087800 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.087838 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") pod \"4775aea1-f465-4995-a37a-1285ed8229dd\" (UID: \"4775aea1-f465-4995-a37a-1285ed8229dd\") " Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.116271 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq" (OuterVolumeSpecName: "kube-api-access-gtkhq") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "kube-api-access-gtkhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.124096 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.143848 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.145232 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.161184 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config" (OuterVolumeSpecName: "config") pod "4775aea1-f465-4995-a37a-1285ed8229dd" (UID: "4775aea1-f465-4995-a37a-1285ed8229dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.189994 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190030 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190040 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190050 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4775aea1-f465-4995-a37a-1285ed8229dd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:43 crc kubenswrapper[4720]: I0121 14:48:43.190060 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtkhq\" (UniqueName: \"kubernetes.io/projected/4775aea1-f465-4995-a37a-1285ed8229dd-kube-api-access-gtkhq\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.029834 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-qtbjz" Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.082549 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.104776 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-qtbjz"] Jan 21 14:48:44 crc kubenswrapper[4720]: I0121 14:48:44.690542 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" path="/var/lib/kubelet/pods/4775aea1-f465-4995-a37a-1285ed8229dd/volumes" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.936049 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:45 crc kubenswrapper[4720]: E0121 14:48:45.937097 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.937179 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.937436 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4775aea1-f465-4995-a37a-1285ed8229dd" containerName="init" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.938368 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.941729 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.942995 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 21 14:48:45 crc kubenswrapper[4720]: I0121 14:48:45.985285 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038247 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038306 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038426 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038454 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.038476 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139510 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139919 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139949 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.139996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140035 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140063 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.140129 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.144882 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b177763-3020-4854-b45a-43d99221c670-logs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.146245 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data-custom\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.147258 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-internal-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.147772 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-config-data\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.148222 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-combined-ca-bundle\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.149191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b177763-3020-4854-b45a-43d99221c670-public-tls-certs\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.158327 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnjzn\" (UniqueName: \"kubernetes.io/projected/3b177763-3020-4854-b45a-43d99221c670-kube-api-access-bnjzn\") pod \"barbican-api-5f448c69d6-sjp2r\" (UID: \"3b177763-3020-4854-b45a-43d99221c670\") " pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:46 crc kubenswrapper[4720]: I0121 14:48:46.294477 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.091939 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerStarted","Data":"332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95"} Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.092888 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.098135 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"17d4eadec8ff63e347f4a38724652faff4a563967bcc0c5d98afe8db8947c21c"} Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.114568 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" podStartSLOduration=11.114554914 podStartE2EDuration="11.114554914s" podCreationTimestamp="2026-01-21 14:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:51.111829704 +0000 UTC m=+1169.020569626" watchObservedRunningTime="2026-01-21 14:48:51.114554914 +0000 UTC m=+1169.023294846" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.175616 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5f448c69d6-sjp2r"] Jan 21 14:48:51 crc kubenswrapper[4720]: W0121 14:48:51.191535 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b177763_3020_4854_b45a_43d99221c670.slice/crio-dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8 WatchSource:0}: Error finding container dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8: Status 404 returned error can't find the container with id dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8 Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.375309 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:48:51 crc kubenswrapper[4720]: I0121 14:48:51.880689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.133695 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"a072d993c4c9b086f9b3fd34af075479c303033d57f5e867fb77f4b5a3a1001f"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.133990 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"6a46c297738169de3ef3d1ba7f16f1ce96061244f5167c53d32ec12f7a944075"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134004 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5f448c69d6-sjp2r" event={"ID":"3b177763-3020-4854-b45a-43d99221c670","Type":"ContainerStarted","Data":"dc99a332bef6104813b852ad966f46a1fbf12baa0af779fc62cb93beb67238d8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134174 4720 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.134203 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.155246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-8f88c9d47-m5rzn" event={"ID":"9355d502-bf01-4465-996d-483d99b92954","Type":"ContainerStarted","Data":"bb91bf9c5a54216cd959a646667850e4ce3f0d88eaac0af56180fdf9f8f472a6"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.171383 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5f448c69d6-sjp2r" podStartSLOduration=7.171355237 podStartE2EDuration="7.171355237s" podCreationTimestamp="2026-01-21 14:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:52.157294724 +0000 UTC m=+1170.066034676" watchObservedRunningTime="2026-01-21 14:48:52.171355237 +0000 UTC m=+1170.080095169" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.179748 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"2e37e880e1ded60bfde91a112ff6c6e3bdb214678f5b94a391c433bade1ec4b8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.179794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" event={"ID":"bb475766-6891-454b-8f7e-1494d9806891","Type":"ContainerStarted","Data":"b481f179dc89b65a9e2d445c9f4cf87d2a628d4bc58ff169e717daab38105392"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.194634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.204108 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-8f88c9d47-m5rzn" podStartSLOduration=2.781047323 podStartE2EDuration="13.204091762s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:40.177979928 +0000 UTC m=+1158.086719860" lastFinishedPulling="2026-01-21 14:48:50.601024367 +0000 UTC m=+1168.509764299" observedRunningTime="2026-01-21 14:48:52.202500491 +0000 UTC m=+1170.111240453" watchObservedRunningTime="2026-01-21 14:48:52.204091762 +0000 UTC m=+1170.112831694" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218159 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerStarted","Data":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218325 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" containerID="cri-o://5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218419 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218450 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" containerID="cri-o://222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218499 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" containerID="cri-o://5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.218543 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" containerID="cri-o://ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.233790 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerStarted","Data":"e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8"} Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234005 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234031 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" containerID="cri-o://e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.234142 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" containerID="cri-o://047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" gracePeriod=30 Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.240370 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6898c4b994-dn9qn" podStartSLOduration=2.9121960380000003 podStartE2EDuration="13.240353348s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:40.272802655 +0000 UTC m=+1158.181542587" lastFinishedPulling="2026-01-21 14:48:50.600959965 +0000 UTC m=+1168.509699897" observedRunningTime="2026-01-21 14:48:52.231196501 +0000 UTC m=+1170.139936433" watchObservedRunningTime="2026-01-21 14:48:52.240353348 +0000 UTC m=+1170.149093280" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.264417 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.244804978 podStartE2EDuration="59.264400488s" podCreationTimestamp="2026-01-21 14:47:53 +0000 UTC" firstStartedPulling="2026-01-21 14:47:54.688124521 +0000 UTC m=+1112.596864453" lastFinishedPulling="2026-01-21 14:48:50.707720031 +0000 UTC m=+1168.616459963" observedRunningTime="2026-01-21 14:48:52.256080114 +0000 UTC m=+1170.164820056" watchObservedRunningTime="2026-01-21 14:48:52.264400488 +0000 UTC m=+1170.173140420" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.284860 4720 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=12.284840356 podStartE2EDuration="12.284840356s" podCreationTimestamp="2026-01-21 14:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:48:52.280894784 +0000 UTC m=+1170.189634726" watchObservedRunningTime="2026-01-21 14:48:52.284840356 +0000 UTC m=+1170.193580288" Jan 21 14:48:52 crc kubenswrapper[4720]: I0121 14:48:52.911023 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247379 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" exitCode=0 Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247411 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" exitCode=2 Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247448 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"} Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.247489 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"} Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.249396 4720 generic.go:334] "Generic (PLEG): container finished" podID="9647cb32-4c23-445c-a66b-c71439bf617d" containerID="047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" exitCode=143 Jan 21 14:48:53 crc kubenswrapper[4720]: I0121 14:48:53.250149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8"} Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.258564 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerStarted","Data":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"} Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.261744 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" exitCode=0 Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.261816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"} Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.278753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.399146176 podStartE2EDuration="15.27873446s" podCreationTimestamp="2026-01-21 14:48:39 +0000 UTC" firstStartedPulling="2026-01-21 14:48:41.708396077 +0000 UTC 
m=+1159.617136009" lastFinishedPulling="2026-01-21 14:48:50.587984361 +0000 UTC m=+1168.496724293" observedRunningTime="2026-01-21 14:48:54.275993519 +0000 UTC m=+1172.184733471" watchObservedRunningTime="2026-01-21 14:48:54.27873446 +0000 UTC m=+1172.187474402" Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.565405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c8b4f85f7-4kz9x" Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.639811 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.640010 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f78c5dfcb-hsblf" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" containerID="cri-o://e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" gracePeriod=30 Jan 21 14:48:54 crc kubenswrapper[4720]: I0121 14:48:54.640203 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f78c5dfcb-hsblf" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" containerID="cri-o://3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" gracePeriod=30 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.230305 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.275833 4720 generic.go:334] "Generic (PLEG): container finished" podID="eef8d65a-fa41-4368-8368-4b50935db576" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" exitCode=0 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.276149 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285275 4720 generic.go:334] "Generic (PLEG): container finished" podID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" exitCode=0 Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285444 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285590 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285636 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2","Type":"ContainerDied","Data":"97811cd0d01525b3eadfaadc0174563ab204a73195d95319ec50a029dadf2846"} Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.285677 4720 scope.go:117] "RemoveContainer" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289204 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289245 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289275 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289310 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289362 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289418 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289448 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") pod \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\" (UID: \"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2\") " Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289704 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.289844 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.290361 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.308830 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts" (OuterVolumeSpecName: "scripts") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.342858 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j" (OuterVolumeSpecName: "kube-api-access-b6x4j") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "kube-api-access-b6x4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.379266 4720 scope.go:117] "RemoveContainer" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.379482 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400632 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400680 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6x4j\" (UniqueName: \"kubernetes.io/projected/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-kube-api-access-b6x4j\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.400691 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.411714 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.439624 4720 scope.go:117] "RemoveContainer" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.478190 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.482115 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-8648996d7d-4f2q4" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.485740 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.502308 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.502336 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.536950 4720 scope.go:117] "RemoveContainer" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.546905 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data" (OuterVolumeSpecName: "config-data") pod "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" (UID: "cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.580279 4720 scope.go:117] "RemoveContainer" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.584963 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": container with ID starting with 5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb not found: ID does not exist" containerID="5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585017 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb"} err="failed to get container status \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": rpc error: code = NotFound desc = could not find container \"5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb\": container with ID starting with 5ea7ea8c41cb3182acf9d44e6f1a53a1e031ff9852935ca821816ea6bac8c0eb not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585060 4720 scope.go:117] "RemoveContainer" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.585410 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": container with ID starting with 222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13 not found: ID does not exist" containerID="222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585477 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13"} err="failed to get container status \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": rpc error: code = NotFound desc = could not find container \"222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13\": container with ID starting with 222f35b26b0728beb62ec61698da4f41418c6e39b7da204a27ccfb0b40fb8d13 not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585495 4720 scope.go:117] "RemoveContainer" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.585761 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": container with ID starting with ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23 not found: ID does not exist" containerID="ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585825 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23"} err="failed to get container status \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": rpc error: code = NotFound desc = could not 
find container \"ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23\": container with ID starting with ae2697d4de89a62c9158e02846068cdf4d641ca5c8dccf5ea2f2fd831333be23 not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.585848 4720 scope.go:117] "RemoveContainer" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.586117 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": container with ID starting with 5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c not found: ID does not exist" containerID="5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.586168 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c"} err="failed to get container status \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": rpc error: code = NotFound desc = could not find container \"5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c\": container with ID starting with 5ce080112e1278b271b9eee49e4451f7cdf38f35f931e7ced0dfd13eaa4d909c not found: ID does not exist" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.604083 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.640867 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.646170 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.732248 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.732972 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.732991 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733019 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733039 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733046 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: E0121 14:48:55.733084 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" 
containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733356 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="sg-core" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733377 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-notification-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733401 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="ceilometer-central-agent" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.733421 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" containerName="proxy-httpd" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.736022 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.753596 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.753820 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.757248 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.833726 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.833931 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834014 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834094 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834190 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" 
Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834273 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.834367 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935845 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935967 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.935984 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936049 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.936070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.937276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" 
Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.937459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.942585 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.942598 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.944016 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.954438 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:55 crc kubenswrapper[4720]: I0121 14:48:55.967376 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"ceilometer-0\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") " pod="openstack/ceilometer-0" Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.089782 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.642589 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:48:56 crc kubenswrapper[4720]: I0121 14:48:56.692212 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2" path="/var/lib/kubelet/pods/cb266c3f-6b50-4953-9f9f-9b41bfc3c4c2/volumes" Jan 21 14:48:57 crc kubenswrapper[4720]: I0121 14:48:57.311605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"3174443988b438c1b000ca3e584da6ddab213fd781467fefd432c56f8e7d99aa"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.336986 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.337515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a"} Jan 21 14:48:58 crc kubenswrapper[4720]: I0121 14:48:58.430420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.154176 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5f448c69d6-sjp2r" Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220223 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220436 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" containerID="cri-o://0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" gracePeriod=30 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.220830 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" containerID="cri-o://4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" gracePeriod=30 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.354831 4720 generic.go:334] "Generic (PLEG): container finished" podID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" exitCode=143 Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.354892 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.361538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae"} Jan 21 14:48:59 crc kubenswrapper[4720]: I0121 14:48:59.836925 4720 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.808896 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.890799 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.891350 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" containerID="cri-o://7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" gracePeriod=10 Jan 21 14:49:00 crc kubenswrapper[4720]: I0121 14:49:00.992244 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.008450 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: connect: connection refused" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.055806 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.384701 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.385060 4720 generic.go:334] "Generic (PLEG): container finished" podID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerID="7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" exitCode=0 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.385109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.395578 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerStarted","Data":"c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.396514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419685 4720 generic.go:334] "Generic (PLEG): container finished" podID="eef8d65a-fa41-4368-8368-4b50935db576" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" exitCode=0 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419792 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f78c5dfcb-hsblf" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419850 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419877 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f78c5dfcb-hsblf" event={"ID":"eef8d65a-fa41-4368-8368-4b50935db576","Type":"ContainerDied","Data":"68237d054d159a4765c532ce618148ffe9f3f4fde0b73c4c9dd1d07a506b6603"} Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.419920 4720 scope.go:117] "RemoveContainer" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.420308 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" containerID="cri-o://39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" gracePeriod=30 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.420456 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" containerID="cri-o://89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" gracePeriod=30 Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452226 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452529 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452672 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452706 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.452759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") pod \"eef8d65a-fa41-4368-8368-4b50935db576\" (UID: \"eef8d65a-fa41-4368-8368-4b50935db576\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.468174 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.009648277 podStartE2EDuration="6.46814408s" 
podCreationTimestamp="2026-01-21 14:48:55 +0000 UTC" firstStartedPulling="2026-01-21 14:48:56.654331218 +0000 UTC m=+1174.563071150" lastFinishedPulling="2026-01-21 14:49:00.112827021 +0000 UTC m=+1178.021566953" observedRunningTime="2026-01-21 14:49:01.45223347 +0000 UTC m=+1179.360973412" watchObservedRunningTime="2026-01-21 14:49:01.46814408 +0000 UTC m=+1179.376884012" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.481187 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh" (OuterVolumeSpecName: "kube-api-access-kjrlh") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "kube-api-access-kjrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.491029 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.556983 4720 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.557011 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjrlh\" (UniqueName: \"kubernetes.io/projected/eef8d65a-fa41-4368-8368-4b50935db576-kube-api-access-kjrlh\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.584200 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.608404 4720 scope.go:117] "RemoveContainer" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.611815 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.615732 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config" (OuterVolumeSpecName: "config") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.648826 4720 scope.go:117] "RemoveContainer" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: E0121 14:49:01.656432 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": container with ID starting with 3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e not found: ID does not exist" containerID="3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.656502 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e"} err="failed to get container status \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": rpc error: code = NotFound desc = could not find container \"3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e\": container with ID starting with 3feae91b582c3edfa398af2c8e492be810876839abf0579e7a49dcb2911c2b6e not found: ID does not exist" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.656536 4720 scope.go:117] "RemoveContainer" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660013 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660086 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660119 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660168 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: E0121 14:49:01.660183 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": container with ID starting with e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3 not found: ID does not exist" containerID="e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660234 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") pod \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\" (UID: \"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01\") " Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660598 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660616 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.660225 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3"} err="failed to get container status \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": rpc error: code = NotFound desc = could not find container \"e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3\": container with ID starting with e74c40ab383c4e2bd435cfe7f1033fab586921ebe963904a94ff3569cafdccd3 not found: ID does not exist" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.676009 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "eef8d65a-fa41-4368-8368-4b50935db576" (UID: "eef8d65a-fa41-4368-8368-4b50935db576"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.688899 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz" (OuterVolumeSpecName: "kube-api-access-lp7tz") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "kube-api-access-lp7tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.728199 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.728237 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.739322 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config" (OuterVolumeSpecName: "config") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.759353 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" (UID: "c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762344 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762373 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7tz\" (UniqueName: \"kubernetes.io/projected/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-kube-api-access-lp7tz\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762384 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762392 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762402 4720 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef8d65a-fa41-4368-8368-4b50935db576-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762409 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.762802 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:49:01 crc kubenswrapper[4720]: I0121 14:49:01.768638 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f78c5dfcb-hsblf"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428016 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" event={"ID":"c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01","Type":"ContainerDied","Data":"63bb1ec50feeba6e58d30594b8e6156c8d4ea90e8cd1ba58353f3003db3dc734"} Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428202 4720 scope.go:117] "RemoveContainer" containerID="7ae7c07ef8890756b398637a11b0756cd3d10ee29c05e003e54c7ef091337410" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.428074 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-99cnk" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.432962 4720 generic.go:334] "Generic (PLEG): container finished" podID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" exitCode=0 Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.433059 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"} Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.447147 4720 scope.go:117] "RemoveContainer" containerID="6bd3c4506512bee2427c44ab8e73cd801e736b9aa463cb2377da3847b6955208" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.492872 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.497469 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-99cnk"] Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.695355 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" path="/var/lib/kubelet/pods/c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01/volumes" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.696041 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef8d65a-fa41-4368-8368-4b50935db576" path="/var/lib/kubelet/pods/eef8d65a-fa41-4368-8368-4b50935db576/volumes" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.771728 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:51548->10.217.0.148:9311: read: connection reset by peer" Jan 21 14:49:02 crc kubenswrapper[4720]: I0121 14:49:02.771958 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7c6979b468-whx5j" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:51558->10.217.0.148:9311: read: connection reset by peer" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.044140 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096645 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096695 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096875 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.096897 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") pod \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\" (UID: \"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.099769 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107853 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct" (OuterVolumeSpecName: "kube-api-access-bn2ct") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "kube-api-access-bn2ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.107838 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts" (OuterVolumeSpecName: "scripts") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199434 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199496 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn2ct\" (UniqueName: \"kubernetes.io/projected/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-kube-api-access-bn2ct\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199512 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.199547 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.220785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.285042 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data" (OuterVolumeSpecName: "config-data") pod "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" (UID: "1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.300732 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.300755 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.355472 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401803 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401845 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401869 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.401938 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.402006 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") pod \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\" (UID: \"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d\") " Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.402678 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs" (OuterVolumeSpecName: "logs") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.424532 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n" (OuterVolumeSpecName: "kube-api-access-hzc7n") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "kube-api-access-hzc7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.424885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.445059 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448838 4720 generic.go:334] "Generic (PLEG): container finished" podID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" exitCode=0 Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448898 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448923 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6979b468-whx5j" event={"ID":"f8e11fac-bcca-42cf-ae8e-1d118f47fc1d","Type":"ContainerDied","Data":"3d7a3ecb27979a3b44e08cb4019116ada1170edb766f69f4f67fae7e26496dfb"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.448939 4720 scope.go:117] "RemoveContainer" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.449024 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6979b468-whx5j" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.464965 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data" (OuterVolumeSpecName: "config-data") pod "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" (UID: "f8e11fac-bcca-42cf-ae8e-1d118f47fc1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466201 4720 generic.go:334] "Generic (PLEG): container finished" podID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" exitCode=0 Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466263 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466273 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.466291 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e","Type":"ContainerDied","Data":"90a6eb13a4e9101f155273acf68ce13d955b44911e5a69b94d712c8ecc3ebd49"} Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511120 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511159 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511171 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzc7n\" (UniqueName: \"kubernetes.io/projected/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-kube-api-access-hzc7n\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511181 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.511192 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.537189 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.537338 4720 scope.go:117] "RemoveContainer" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554415 4720 scope.go:117] "RemoveContainer" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.554848 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": container with ID starting with 4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7 not found: ID does not exist" containerID="4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554891 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7"} err="failed to get container status \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": rpc error: code = NotFound desc = could not find container \"4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7\": container with ID starting with 4098a5820a89817a76b4d0b719ebfb677003556b981d87803c50c6b583d0e1d7 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.554917 4720 scope.go:117] "RemoveContainer" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.555164 4720 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": container with ID starting with 0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be not found: ID does not exist" containerID="0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555193 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be"} err="failed to get container status \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": rpc error: code = NotFound desc = could not find container \"0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be\": container with ID starting with 0ca615ad4570d56aa21daf5f2fa8602490f28f67d5858fadbe6bbdc9f51679be not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555214 4720 scope.go:117] "RemoveContainer" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.555840 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566548 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566922 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566940 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566960 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566966 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566978 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566983 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.566994 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.566999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567011 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567018 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567030 4720 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567036 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567044 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="init" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567049 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="init" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.567058 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567063 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567213 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567227 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" containerName="barbican-api-log" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567238 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="probe" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567247 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-httpd" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567258 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef8d65a-fa41-4368-8368-4b50935db576" containerName="neutron-api" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567268 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" containerName="cinder-scheduler" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.567279 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d8ddeb-10f2-48b8-b8fe-e22112a5dd01" containerName="dnsmasq-dns" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.568129 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.571898 4720 scope.go:117] "RemoveContainer" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.572926 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.580892 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615098 4720 scope.go:117] "RemoveContainer" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.615894 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": container with ID starting with 89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4 not found: ID does not exist" containerID="89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615921 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4"} err="failed to get container status \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": rpc error: code = NotFound desc = could not find container \"89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4\": container with ID starting with 89118313964a3db330a53f09f74c985ed214ae0d13b3b0a549e6e6f0d7d826d4 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.615942 4720 scope.go:117] "RemoveContainer" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: E0121 14:49:03.616281 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": container with ID starting with 39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35 not found: ID does not exist" containerID="39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.616933 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35"} err="failed to get container status \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": rpc error: code = NotFound desc = could not find container \"39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35\": container with ID starting with 39475d8695e8971ed2751b2136ab747e574ad565cce4b09314e177ccf6e17f35 not found: ID does not exist" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714719 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714764 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714811 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714861 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714893 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.714927 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.805219 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.809061 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7c6979b468-whx5j"] Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821514 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821642 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821710 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821761 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.821850 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0896fa5e-6919-42bf-9e61-cf73218e9edf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.830329 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-scripts\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.830772 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.832431 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-config-data\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.832790 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0896fa5e-6919-42bf-9e61-cf73218e9edf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.836477 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcr6l\" (UniqueName: \"kubernetes.io/projected/0896fa5e-6919-42bf-9e61-cf73218e9edf-kube-api-access-fcr6l\") pod \"cinder-scheduler-0\" (UID: \"0896fa5e-6919-42bf-9e61-cf73218e9edf\") " pod="openstack/cinder-scheduler-0" Jan 21 14:49:03 crc kubenswrapper[4720]: I0121 14:49:03.898214 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.393434 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.504559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"886885cb6e867abef9b8b6b44ea76bb57f82465acc480ae208bd577e2f5e06f7"} Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.668196 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-69cc8766db-gdch7" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.691083 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e" path="/var/lib/kubelet/pods/1c32ba08-0c9c-4f0a-b9f4-f56e91ee566e/volumes" Jan 21 14:49:04 crc kubenswrapper[4720]: I0121 14:49:04.692046 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e11fac-bcca-42cf-ae8e-1d118f47fc1d" path="/var/lib/kubelet/pods/f8e11fac-bcca-42cf-ae8e-1d118f47fc1d/volumes" Jan 21 14:49:05 crc kubenswrapper[4720]: I0121 14:49:05.521499 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"872a9c27150d862707f5a91175d8e8e15aaba3e73cf731e7a9533577a4dde8c1"} Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.251727 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.252708 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.257544 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.258486 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.258816 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-5w78j" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.262894 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.368981 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369034 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369080 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnljv\" (UniqueName: 
\"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.369133 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.470623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.470947 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471029 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnljv\" (UniqueName: \"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471111 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.471740 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.477618 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.484447 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-openstack-config-secret\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.491943 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnljv\" (UniqueName: \"kubernetes.io/projected/4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6-kube-api-access-pnljv\") pod \"openstackclient\" (UID: \"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6\") " pod="openstack/openstackclient" Jan 21 14:49:06 crc 
kubenswrapper[4720]: I0121 14:49:06.543301 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0896fa5e-6919-42bf-9e61-cf73218e9edf","Type":"ContainerStarted","Data":"9d964c86db80d0e6e48d9004a7e85b0f246592766bb6da83415075b51041b482"} Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.565093 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.565073282 podStartE2EDuration="3.565073282s" podCreationTimestamp="2026-01-21 14:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:06.565067712 +0000 UTC m=+1184.473807634" watchObservedRunningTime="2026-01-21 14:49:06.565073282 +0000 UTC m=+1184.473813214" Jan 21 14:49:06 crc kubenswrapper[4720]: I0121 14:49:06.573050 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 21 14:49:07 crc kubenswrapper[4720]: I0121 14:49:07.125250 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 21 14:49:07 crc kubenswrapper[4720]: W0121 14:49:07.134937 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bb447ec_c7a1_4d3b_bcb7_e05d5ead9fa6.slice/crio-bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180 WatchSource:0}: Error finding container bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180: Status 404 returned error can't find the container with id bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180 Jan 21 14:49:07 crc kubenswrapper[4720]: I0121 14:49:07.550347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6","Type":"ContainerStarted","Data":"bbbf83e8bc8b953b796a2c68fe326accc6d585d3ee98f79c58dfb574f45e9180"} Jan 21 14:49:08 crc kubenswrapper[4720]: I0121 14:49:08.898598 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.163590 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.959936 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960384 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent" containerID="cri-o://2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960507 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" containerID="cri-o://c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960544 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core" containerID="cri-o://e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae" gracePeriod=30 Jan 21 
14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.960575 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent" containerID="cri-o://b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5" gracePeriod=30 Jan 21 14:49:14 crc kubenswrapper[4720]: I0121 14:49:14.974213 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.153:3000/\": read tcp 10.217.0.2:49704->10.217.0.153:3000: read: connection reset by peer" Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631836 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2" exitCode=0 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631880 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae" exitCode=2 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631890 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a" exitCode=0 Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631898 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2"} Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631949 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae"} Jan 21 14:49:15 crc kubenswrapper[4720]: I0121 14:49:15.631965 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a"} Jan 21 14:49:17 crc kubenswrapper[4720]: I0121 14:49:17.655153 4720 generic.go:334] "Generic (PLEG): container finished" podID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerID="b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5" exitCode=0 Jan 21 14:49:17 crc kubenswrapper[4720]: I0121 14:49:17.655246 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5"} Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.140574 4720 util.go:48] "No ready sandbox for pod can be found. 
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270000 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270114 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270247 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270274 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270597 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270615 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.270985 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.271029 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.271394 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") pod \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\" (UID: \"750b936e-3a77-4d1a-abc8-94f4a64cb5f7\") "
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.272090 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.272116 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.274893 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq" (OuterVolumeSpecName: "kube-api-access-m7gxq") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "kube-api-access-m7gxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.275746 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts" (OuterVolumeSpecName: "scripts") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.296952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.358855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data" (OuterVolumeSpecName: "config-data") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.358865 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "750b936e-3a77-4d1a-abc8-94f4a64cb5f7" (UID: "750b936e-3a77-4d1a-abc8-94f4a64cb5f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373013 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373045 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gxq\" (UniqueName: \"kubernetes.io/projected/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-kube-api-access-m7gxq\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373057 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373065 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.373073 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/750b936e-3a77-4d1a-abc8-94f4a64cb5f7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.664219 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6","Type":"ContainerStarted","Data":"782e00c2d9ebe5ff06d1034e3f65d84ed7840161d45f1ff7d564ebc20b056494"}
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667243 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"750b936e-3a77-4d1a-abc8-94f4a64cb5f7","Type":"ContainerDied","Data":"3174443988b438c1b000ca3e584da6ddab213fd781467fefd432c56f8e7d99aa"}
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667273 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.667320 4720 scope.go:117] "RemoveContainer" containerID="c0b245aa16f982abb708af6bfe73d8463cad116c5e48529117c9a2f4d4dd5ba2"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.687003 4720 scope.go:117] "RemoveContainer" containerID="e3ddf7580cec87f0fd50cb9869ba5c47ad980eb5e9ee4ffcde67a0271a42b7ae"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.697816 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.935553431 podStartE2EDuration="12.697797088s" podCreationTimestamp="2026-01-21 14:49:06 +0000 UTC" firstStartedPulling="2026-01-21 14:49:07.136920554 +0000 UTC m=+1185.045660486" lastFinishedPulling="2026-01-21 14:49:17.899164201 +0000 UTC m=+1195.807904143" observedRunningTime="2026-01-21 14:49:18.679170757 +0000 UTC m=+1196.587910700" watchObservedRunningTime="2026-01-21 14:49:18.697797088 +0000 UTC m=+1196.606537020"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.727399 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.751790 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.774743 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775162 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775185 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775205 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775214 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775231 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775239 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd"
Jan 21 14:49:18 crc kubenswrapper[4720]: E0121 14:49:18.775265 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775273 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775457 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-central-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775473 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="proxy-httpd"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775499 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="ceilometer-notification-agent"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.775508 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" containerName="sg-core"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.778235 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.778266 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.785756 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.785933 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.802967 4720 scope.go:117] "RemoveContainer" containerID="b4911b5a02877cc5031098b1df444df7accfc5e79babc4b70957cca0941831f5"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.836332 4720 scope.go:117] "RemoveContainer" containerID="2ca946f6124a12c5254e3038b9d09ccabdea2284acbabae1dd0c4eb8aece072a"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886639 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886733 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886768 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886797 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886832 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.886877 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988338 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988633 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988808 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.988884 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989033 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989270 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989418 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.989331 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.993798 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.995192 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.999032 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:18 crc kubenswrapper[4720]: I0121 14:49:18.999954 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.007782 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"ceilometer-0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") " pod="openstack/ceilometer-0"
Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.108366 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.608265 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:19 crc kubenswrapper[4720]: W0121 14:49:19.612098 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f753ae_130b_46ab_a544_e694a81b09b0.slice/crio-168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9 WatchSource:0}: Error finding container 168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9: Status 404 returned error can't find the container with id 168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9
Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.679837 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9"}
Jan 21 14:49:19 crc kubenswrapper[4720]: I0121 14:49:19.808242 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:20 crc kubenswrapper[4720]: I0121 14:49:20.689049 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="750b936e-3a77-4d1a-abc8-94f4a64cb5f7" path="/var/lib/kubelet/pods/750b936e-3a77-4d1a-abc8-94f4a64cb5f7/volumes"
Jan 21 14:49:20 crc kubenswrapper[4720]: I0121 14:49:20.690260 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7"}
Jan 21 14:49:21 crc kubenswrapper[4720]: I0121 14:49:21.696934 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2"}
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.712633 4720 generic.go:334] "Generic (PLEG): container finished" podID="9647cb32-4c23-445c-a66b-c71439bf617d" containerID="e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" exitCode=137
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.713177 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8"}
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.725830 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee"}
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.861360 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.879769 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:49:22 crc kubenswrapper[4720]: I0121 14:49:22.879816 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059370 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059432 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059550 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059575 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059706 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059772 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.059824 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") pod \"9647cb32-4c23-445c-a66b-c71439bf617d\" (UID: \"9647cb32-4c23-445c-a66b-c71439bf617d\") "
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.065789 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.066264 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs" (OuterVolumeSpecName: "logs") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.083767 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.092772 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4" (OuterVolumeSpecName: "kube-api-access-88pl4") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "kube-api-access-88pl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.102480 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts" (OuterVolumeSpecName: "scripts") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.135319 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.141977 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data" (OuterVolumeSpecName: "config-data") pod "9647cb32-4c23-445c-a66b-c71439bf617d" (UID: "9647cb32-4c23-445c-a66b-c71439bf617d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.162974 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163008 4720 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163017 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163025 4720 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9647cb32-4c23-445c-a66b-c71439bf617d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163033 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9647cb32-4c23-445c-a66b-c71439bf617d-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163041 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9647cb32-4c23-445c-a66b-c71439bf617d-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.163049 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pl4\" (UniqueName: \"kubernetes.io/projected/9647cb32-4c23-445c-a66b-c71439bf617d-kube-api-access-88pl4\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736470 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerStarted","Data":"162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61"} Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736787 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" containerID="cri-o://fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736815 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736866 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" containerID="cri-o://162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736901 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" containerID="cri-o://7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.736931 4720 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" containerID="cri-o://80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" gracePeriod=30 Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742702 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9647cb32-4c23-445c-a66b-c71439bf617d","Type":"ContainerDied","Data":"88ba151a7faa9c14ce1ffb6ea4af84a9781883a8fd8016a1bc953df79c7742c2"} Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742754 4720 scope.go:117] "RemoveContainer" containerID="e645392ee717e99317e92da4dc3564263b8bac8f71e81e0d2dbbe2e76fae43d8" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.742755 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.773774 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037166814 podStartE2EDuration="5.773754037s" podCreationTimestamp="2026-01-21 14:49:18 +0000 UTC" firstStartedPulling="2026-01-21 14:49:19.61474523 +0000 UTC m=+1197.523485162" lastFinishedPulling="2026-01-21 14:49:23.351332463 +0000 UTC m=+1201.260072385" observedRunningTime="2026-01-21 14:49:23.764334945 +0000 UTC m=+1201.673074887" watchObservedRunningTime="2026-01-21 14:49:23.773754037 +0000 UTC m=+1201.682493969" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.789317 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.789471 4720 scope.go:117] "RemoveContainer" containerID="047ba4251d7b9546136638ee316dd3cee40c9a3f72ef38a8ab5feee0727fa5c8" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.807984 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827182 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 21 14:49:23 crc kubenswrapper[4720]: E0121 14:49:23.827621 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827644 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: E0121 14:49:23.827690 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827700 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827874 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.827906 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" containerName="cinder-api-log" Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.828776 4720 util.go:30] "No sandbox for pod can be found. 
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834240 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834299 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.834614 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.842094 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973406 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973483 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973528 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973571 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973603 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973891 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.973981 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.974016 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvsr9\" (UniqueName: \"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:23 crc kubenswrapper[4720]: I0121 14:49:23.974056 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075669 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075818 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075860 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvsr9\" (UniqueName: \"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075911 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.075976 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076008 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e4281fdf-eb56-41e8-a750-13ee7ac37bea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076088 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076162 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.076394 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4281fdf-eb56-41e8-a750-13ee7ac37bea-logs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080071 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data-custom\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080191 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080517 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-scripts\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.080697 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.083362 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.087085 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4281fdf-eb56-41e8-a750-13ee7ac37bea-config-data\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.102201 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvsr9\" (UniqueName: \"kubernetes.io/projected/e4281fdf-eb56-41e8-a750-13ee7ac37bea-kube-api-access-rvsr9\") pod \"cinder-api-0\" (UID: \"e4281fdf-eb56-41e8-a750-13ee7ac37bea\") " pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.142582 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.672280 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.694155 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9647cb32-4c23-445c-a66b-c71439bf617d" path="/var/lib/kubelet/pods/9647cb32-4c23-445c-a66b-c71439bf617d/volumes"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761082 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" exitCode=0
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761110 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" exitCode=2
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761133 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" exitCode=0
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761140 4720 generic.go:334] "Generic (PLEG): container finished" podID="68f753ae-130b-46ab-a544-e694a81b09b0" containerID="fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" exitCode=0
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761176 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761214 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761224 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761242 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"68f753ae-130b-46ab-a544-e694a81b09b0","Type":"ContainerDied","Data":"168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.761250 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="168dc9e1caf65495e6d22713c0516b4a76a2c6d8846fe3dd732420e7cb0959b9"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.762519 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"4b94f6b9b9b43b4ec34335af772443783caa0a6f26b0a54adcd5907d5d0118ce"}
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.789058 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887781 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887837 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887869 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887917 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887945 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.887979 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888013 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") pod \"68f753ae-130b-46ab-a544-e694a81b09b0\" (UID: \"68f753ae-130b-46ab-a544-e694a81b09b0\") "
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888261 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.888431 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.897840 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf" (OuterVolumeSpecName: "kube-api-access-tvmdf") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "kube-api-access-tvmdf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.903048 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts" (OuterVolumeSpecName: "scripts") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.933113 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989701 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-run-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989733 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/68f753ae-130b-46ab-a544-e694a81b09b0-log-httpd\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989748 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvmdf\" (UniqueName: \"kubernetes.io/projected/68f753ae-130b-46ab-a544-e694a81b09b0-kube-api-access-tvmdf\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989760 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.989771 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Jan 21 14:49:24 crc kubenswrapper[4720]: I0121 14:49:24.992494 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.020929 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data" (OuterVolumeSpecName: "config-data") pod "68f753ae-130b-46ab-a544-e694a81b09b0" (UID: "68f753ae-130b-46ab-a544-e694a81b09b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.091907 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.091948 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f753ae-130b-46ab-a544-e694a81b09b0-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.655633 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656189 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656201 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656215 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656222 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656233 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656241 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: E0121 14:49:25.656252 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656259 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656404 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="proxy-httpd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656418 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-central-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656429 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="sg-core" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.656438 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" containerName="ceilometer-notification-agent" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.664016 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.674481 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.776414 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"9ca8e392ad7fbddfe43873c0128f07b8d0a7c9eb23029d723d3acbcb81226b55"} Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.776459 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.811045 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.812278 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.812323 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.824596 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.862541 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.882125 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.889835 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.890838 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.894066 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.894227 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.900125 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.904213 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.910132 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.913991 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914097 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914166 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914212 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914322 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914347 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914407 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914455 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: 
\"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914565 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914620 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.914642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.915917 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.984898 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"nova-api-db-create-c5zqd\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.992106 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:25 crc kubenswrapper[4720]: I0121 14:49:25.997828 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015301 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015337 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015358 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015372 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015401 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015440 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015467 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015503 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015518 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: 
\"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015540 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.015560 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.016070 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.017635 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.017906 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.018338 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.029648 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.034214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.036543 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.042081 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.046443 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.056212 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"ceilometer-0\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.071459 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"nova-api-61ab-account-create-update-4mch7\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.075529 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.080197 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"nova-cell0-db-create-62k9x\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.132403 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.139701 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.158928 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.206073 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.211839 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.226010 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.226086 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.240012 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329110 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329178 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.329924 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.341094 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.342438 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.344702 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.373543 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.431235 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.431331 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.432349 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"nova-cell1-db-create-99kbn\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.485305 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.532969 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.533233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.533917 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.587228 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"nova-cell0-d9b2-account-create-update-dld7b\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.646960 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.648453 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.657900 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.659500 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.751280 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f753ae-130b-46ab-a544-e694a81b09b0" path="/var/lib/kubelet/pods/68f753ae-130b-46ab-a544-e694a81b09b0/volumes" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.764383 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.849555 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.850192 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.964233 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.964312 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.965175 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:26 crc kubenswrapper[4720]: I0121 14:49:26.981580 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"nova-cell1-b472-account-create-update-cmqsp\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.041072 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.074223 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.084561 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.299752 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.323846 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:49:27 crc kubenswrapper[4720]: W0121 14:49:27.334771 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9cf579e_cb45_4984_8558_107b9576d977.slice/crio-313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9 WatchSource:0}: Error finding container 313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9: Status 404 returned error can't find the container with id 313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9 Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.351499 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.588426 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:49:27 crc kubenswrapper[4720]: W0121 14:49:27.714260 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad73ec2f_ba76_4451_8202_33403a41de12.slice/crio-9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd WatchSource:0}: Error finding container 9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd: Status 404 returned error can't find the container with id 9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.716692 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.835544 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerStarted","Data":"3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.835582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerStarted","Data":"e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.848139 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-99kbn" podStartSLOduration=1.84812102 podStartE2EDuration="1.84812102s" podCreationTimestamp="2026-01-21 14:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.847735641 +0000 UTC m=+1205.756475573" watchObservedRunningTime="2026-01-21 14:49:27.84812102 +0000 UTC m=+1205.756860952" Jan 21 14:49:27 crc 
kubenswrapper[4720]: I0121 14:49:27.853692 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerStarted","Data":"582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.853738 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerStarted","Data":"fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.862308 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerStarted","Data":"9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.866393 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-62k9x" podStartSLOduration=2.866385072 podStartE2EDuration="2.866385072s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.865973512 +0000 UTC m=+1205.774713444" watchObservedRunningTime="2026-01-21 14:49:27.866385072 +0000 UTC m=+1205.775125004" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.876410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerStarted","Data":"76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.876457 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerStarted","Data":"ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.889948 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e4281fdf-eb56-41e8-a750-13ee7ac37bea","Type":"ContainerStarted","Data":"a3d07e8f53a4709678b7f961476ba1888c41c2fe302f5ad0101a7fc048065db3"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.890613 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.904591 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" podStartSLOduration=1.904567248 podStartE2EDuration="1.904567248s" podCreationTimestamp="2026-01-21 14:49:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.902884225 +0000 UTC m=+1205.811624157" watchObservedRunningTime="2026-01-21 14:49:27.904567248 +0000 UTC m=+1205.813307180" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.909032 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"d769c49ed2fe68686c374a2f8612b148bed4023ed4696f58a59cd9bf88586865"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 
14:49:27.916896 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerStarted","Data":"72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.916946 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerStarted","Data":"d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.918776 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerStarted","Data":"640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.918795 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerStarted","Data":"313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9"} Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.926150 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.926134805 podStartE2EDuration="4.926134805s" podCreationTimestamp="2026-01-21 14:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.923327002 +0000 UTC m=+1205.832066964" watchObservedRunningTime="2026-01-21 14:49:27.926134805 +0000 UTC m=+1205.834874737" Jan 21 14:49:27 crc kubenswrapper[4720]: I0121 14:49:27.951835 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-c5zqd" podStartSLOduration=2.9518162869999998 podStartE2EDuration="2.951816287s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.940432894 +0000 UTC m=+1205.849172836" watchObservedRunningTime="2026-01-21 14:49:27.951816287 +0000 UTC m=+1205.860556219" Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.942035 4720 generic.go:334] "Generic (PLEG): container finished" podID="a08abcad-85f1-431b-853e-3599eebed756" containerID="582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.942130 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerDied","Data":"582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.944325 4720 generic.go:334] "Generic (PLEG): container finished" podID="ad73ec2f-ba76-4451-8202-33403a41de12" containerID="8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.944384 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerDied","Data":"8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 
14:49:28.946553 4720 generic.go:334] "Generic (PLEG): container finished" podID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerID="76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.946609 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerDied","Data":"76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.948592 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.949640 4720 generic.go:334] "Generic (PLEG): container finished" podID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerID="72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.949694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerDied","Data":"72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.951618 4720 generic.go:334] "Generic (PLEG): container finished" podID="d9cf579e-cb45-4984-8558-107b9576d977" containerID="640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.951702 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerDied","Data":"640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.953628 4720 generic.go:334] "Generic (PLEG): container finished" podID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerID="3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821" exitCode=0 Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.953743 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerDied","Data":"3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821"} Jan 21 14:49:28 crc kubenswrapper[4720]: I0121 14:49:28.958255 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-61ab-account-create-update-4mch7" podStartSLOduration=3.958239609 podStartE2EDuration="3.958239609s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:49:27.96120703 +0000 UTC m=+1205.869946972" watchObservedRunningTime="2026-01-21 14:49:28.958239609 +0000 UTC m=+1206.866979541" Jan 21 14:49:29 crc kubenswrapper[4720]: I0121 14:49:29.963617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.976895 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-db-create-99kbn" event={"ID":"01f8146d-b3dd-48a4-b1a8-9fa590c0d808","Type":"ContainerDied","Data":"e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.977198 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28517e6ac3c028cc290c856ceb7d45cb3555aba34ea47c5d62a6fa80bb94aed" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.980221 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-62k9x" event={"ID":"a08abcad-85f1-431b-853e-3599eebed756","Type":"ContainerDied","Data":"fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.980252 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe1617177061917c89b74484b60aa11dbd8cdee5da46170e9e019a43335bb85b" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.981718 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" event={"ID":"ad73ec2f-ba76-4451-8202-33403a41de12","Type":"ContainerDied","Data":"9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.981739 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b391bb1b62fa1cdf1c4454a58d43c3f44f680eaa7e1eb55126bfb497cf706fd" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.983228 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" event={"ID":"a12f971e-bd5e-4b60-9d28-06c786d852ae","Type":"ContainerDied","Data":"ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.983245 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9e78a7f4a1867d0489a2a47d2924a5a5ff066089831878611ea21481537a78" Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.985142 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.988105 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-61ab-account-create-update-4mch7" event={"ID":"af31d5e0-11e6-433b-a31e-bea14d7e5c95","Type":"ContainerDied","Data":"d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85"} Jan 21 14:49:30 crc kubenswrapper[4720]: I0121 14:49:30.988147 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d184762624d55441d829d3557584647905074487e8ec82564ca9757072387f85" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.001789 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.008322 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.014300 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.020184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.028920 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.032042 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095171 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") pod \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095298 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") pod \"a12f971e-bd5e-4b60-9d28-06c786d852ae\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095324 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") pod \"a08abcad-85f1-431b-853e-3599eebed756\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095363 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") pod \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\" (UID: \"af31d5e0-11e6-433b-a31e-bea14d7e5c95\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095412 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") pod \"ad73ec2f-ba76-4451-8202-33403a41de12\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095456 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") pod \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095507 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") pod \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\" (UID: \"01f8146d-b3dd-48a4-b1a8-9fa590c0d808\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095553 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") pod \"d9cf579e-cb45-4984-8558-107b9576d977\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") 
" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095580 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") pod \"a12f971e-bd5e-4b60-9d28-06c786d852ae\" (UID: \"a12f971e-bd5e-4b60-9d28-06c786d852ae\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095647 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") pod \"d9cf579e-cb45-4984-8558-107b9576d977\" (UID: \"d9cf579e-cb45-4984-8558-107b9576d977\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095680 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") pod \"ad73ec2f-ba76-4451-8202-33403a41de12\" (UID: \"ad73ec2f-ba76-4451-8202-33403a41de12\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.095698 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") pod \"a08abcad-85f1-431b-853e-3599eebed756\" (UID: \"a08abcad-85f1-431b-853e-3599eebed756\") " Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096074 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a12f971e-bd5e-4b60-9d28-06c786d852ae" (UID: "a12f971e-bd5e-4b60-9d28-06c786d852ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096157 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af31d5e0-11e6-433b-a31e-bea14d7e5c95" (UID: "af31d5e0-11e6-433b-a31e-bea14d7e5c95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096411 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a08abcad-85f1-431b-853e-3599eebed756" (UID: "a08abcad-85f1-431b-853e-3599eebed756"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.096685 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad73ec2f-ba76-4451-8202-33403a41de12" (UID: "ad73ec2f-ba76-4451-8202-33403a41de12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.097116 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9cf579e-cb45-4984-8558-107b9576d977" (UID: "d9cf579e-cb45-4984-8558-107b9576d977"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.097399 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "01f8146d-b3dd-48a4-b1a8-9fa590c0d808" (UID: "01f8146d-b3dd-48a4-b1a8-9fa590c0d808"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.103618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt" (OuterVolumeSpecName: "kube-api-access-dn6nt") pod "d9cf579e-cb45-4984-8558-107b9576d977" (UID: "d9cf579e-cb45-4984-8558-107b9576d977"). InnerVolumeSpecName "kube-api-access-dn6nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.105218 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp" (OuterVolumeSpecName: "kube-api-access-n7zqp") pod "a08abcad-85f1-431b-853e-3599eebed756" (UID: "a08abcad-85f1-431b-853e-3599eebed756"). InnerVolumeSpecName "kube-api-access-n7zqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.105850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82" (OuterVolumeSpecName: "kube-api-access-rrp82") pod "af31d5e0-11e6-433b-a31e-bea14d7e5c95" (UID: "af31d5e0-11e6-433b-a31e-bea14d7e5c95"). InnerVolumeSpecName "kube-api-access-rrp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.107218 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb" (OuterVolumeSpecName: "kube-api-access-8sndb") pod "a12f971e-bd5e-4b60-9d28-06c786d852ae" (UID: "a12f971e-bd5e-4b60-9d28-06c786d852ae"). InnerVolumeSpecName "kube-api-access-8sndb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.107266 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn" (OuterVolumeSpecName: "kube-api-access-ccgqn") pod "01f8146d-b3dd-48a4-b1a8-9fa590c0d808" (UID: "01f8146d-b3dd-48a4-b1a8-9fa590c0d808"). InnerVolumeSpecName "kube-api-access-ccgqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.109825 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66" (OuterVolumeSpecName: "kube-api-access-hvt66") pod "ad73ec2f-ba76-4451-8202-33403a41de12" (UID: "ad73ec2f-ba76-4451-8202-33403a41de12"). InnerVolumeSpecName "kube-api-access-hvt66". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197730 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrp82\" (UniqueName: \"kubernetes.io/projected/af31d5e0-11e6-433b-a31e-bea14d7e5c95-kube-api-access-rrp82\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197762 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a12f971e-bd5e-4b60-9d28-06c786d852ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197771 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a08abcad-85f1-431b-853e-3599eebed756-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197780 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af31d5e0-11e6-433b-a31e-bea14d7e5c95-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197791 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad73ec2f-ba76-4451-8202-33403a41de12-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197799 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccgqn\" (UniqueName: \"kubernetes.io/projected/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-kube-api-access-ccgqn\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197808 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01f8146d-b3dd-48a4-b1a8-9fa590c0d808-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197817 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dn6nt\" (UniqueName: \"kubernetes.io/projected/d9cf579e-cb45-4984-8558-107b9576d977-kube-api-access-dn6nt\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197825 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sndb\" (UniqueName: \"kubernetes.io/projected/a12f971e-bd5e-4b60-9d28-06c786d852ae-kube-api-access-8sndb\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197834 4720 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9cf579e-cb45-4984-8558-107b9576d977-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197842 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvt66\" (UniqueName: \"kubernetes.io/projected/ad73ec2f-ba76-4451-8202-33403a41de12-kube-api-access-hvt66\") on node \"crc\" DevicePath 
\"\"" Jan 21 14:49:31 crc kubenswrapper[4720]: I0121 14:49:31.197850 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7zqp\" (UniqueName: \"kubernetes.io/projected/a08abcad-85f1-431b-853e-3599eebed756-kube-api-access-n7zqp\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005642 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-62k9x" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005668 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-c5zqd" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005691 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d9b2-account-create-update-dld7b" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005709 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-c5zqd" event={"ID":"d9cf579e-cb45-4984-8558-107b9576d977","Type":"ContainerDied","Data":"313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9"} Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005644 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b472-account-create-update-cmqsp" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005762 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-99kbn" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.006518 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-61ab-account-create-update-4mch7" Jan 21 14:49:32 crc kubenswrapper[4720]: I0121 14:49:32.005736 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313b82cca25744f795b0c71fee7f82ab9b0c99cd93e8e71d3469f6dadb27c2f9" Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.016109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerStarted","Data":"8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e"} Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.016586 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:49:33 crc kubenswrapper[4720]: I0121 14:49:33.042519 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.692672333 podStartE2EDuration="8.042479547s" podCreationTimestamp="2026-01-21 14:49:25 +0000 UTC" firstStartedPulling="2026-01-21 14:49:27.396513422 +0000 UTC m=+1205.305253344" lastFinishedPulling="2026-01-21 14:49:31.746320626 +0000 UTC m=+1209.655060558" observedRunningTime="2026-01-21 14:49:33.042025236 +0000 UTC m=+1210.950765178" watchObservedRunningTime="2026-01-21 14:49:33.042479547 +0000 UTC m=+1210.951219499" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.347886 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.673766 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674148 4720 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674166 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674178 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674185 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674196 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674202 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674211 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674217 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674227 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674233 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: E0121 14:49:36.674250 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674255 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674411 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cf579e-cb45-4984-8558-107b9576d977" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674431 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674440 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674450 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674464 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a08abcad-85f1-431b-853e-3599eebed756" containerName="mariadb-database-create" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674478 4720 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" containerName="mariadb-account-create-update" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.674988 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681464 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681754 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.681896 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pqrwq" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.692013 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701757 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.701886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804070 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 
14:49:36.804204 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.804244 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.810727 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.810811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.823313 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:36 crc kubenswrapper[4720]: I0121 14:49:36.835999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"nova-cell0-conductor-db-sync-vm954\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") " pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:37 crc kubenswrapper[4720]: I0121 14:49:37.000982 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954" Jan 21 14:49:37 crc kubenswrapper[4720]: I0121 14:49:37.509146 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 14:49:38 crc kubenswrapper[4720]: I0121 14:49:38.065100 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerStarted","Data":"09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49"} Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.153958 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154603 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" containerID="cri-o://7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154707 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" containerID="cri-o://8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154736 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" containerID="cri-o://8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.154832 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" containerID="cri-o://1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" gracePeriod=30 Jan 21 14:49:41 crc kubenswrapper[4720]: I0121 14:49:41.182647 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.159:3000/\": EOF" Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.123782 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" exitCode=0 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124081 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" exitCode=2 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124093 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" exitCode=0 Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.123868 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e"} Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124131 4720 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac"} Jan 21 14:49:42 crc kubenswrapper[4720]: I0121 14:49:42.124148 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79"} Jan 21 14:49:43 crc kubenswrapper[4720]: I0121 14:49:43.140427 4720 generic.go:334] "Generic (PLEG): container finished" podID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerID="1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" exitCode=0 Jan 21 14:49:43 crc kubenswrapper[4720]: I0121 14:49:43.140489 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb"} Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.051503 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096250 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096294 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096380 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096417 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096460 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.096495 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") pod \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\" (UID: \"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f\") " Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.098216 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.098611 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.109368 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4" (OuterVolumeSpecName: "kube-api-access-pj7h4") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "kube-api-access-pj7h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.113860 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts" (OuterVolumeSpecName: "scripts") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200018 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200296 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj7h4\" (UniqueName: \"kubernetes.io/projected/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-kube-api-access-pj7h4\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200366 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.200437 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.209521 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.229823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data" (OuterVolumeSpecName: "config-data") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.230223 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" (UID: "a45acaac-b9b3-4ca7-9aca-7c4e67ec145f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a45acaac-b9b3-4ca7-9aca-7c4e67ec145f","Type":"ContainerDied","Data":"d769c49ed2fe68686c374a2f8612b148bed4023ed4696f58a59cd9bf88586865"} Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237778 4720 scope.go:117] "RemoveContainer" containerID="8b1da56d82bf0243680a9e8d3aa292d2a180e54b1f7678c12089ef984113676e" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.237780 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.270498 4720 scope.go:117] "RemoveContainer" containerID="8d16c2d9a5339da420c5910e4518fd27d97a69b6f265da84227c24dca04de7ac" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.297536 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.297818 4720 scope.go:117] "RemoveContainer" containerID="1364655a728c1356bdefce12e2d0e44573b775d14960ce98440245b79d150bcb" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302280 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302306 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.302315 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.322251 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.323012 4720 scope.go:117] "RemoveContainer" containerID="7f7e4c6d468593daf229306bc581366e165d475f87a2f54258d6febc1dbaad79" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.325863 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326197 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" 
containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326213 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326222 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326229 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326237 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326244 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: E0121 14:49:50.326260 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326266 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326419 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="sg-core" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326428 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-central-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326440 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="ceilometer-notification-agent" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.326452 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" containerName="proxy-httpd" Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.328247 4720 util.go:30] "No sandbox for pod can be found. 
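The paired E/I records above appear to be routine admission-time housekeeping rather than failures: when a new pod is admitted, the CPU manager, its in-memory CPUSet state, and the memory manager each drop entries left behind by containers of pods that no longer exist (here, the old ceilometer-0 instance). An illustrative sketch of the pattern, not kubelet code; the function and names are hypothetical:

    # Per-container assignments keyed by (podUID, containerName); on pod
    # admission, every key whose pod is no longer active is swept out.
    def remove_stale_state(assignments: dict, active_pod_uids: set) -> dict:
        stale = [key for key in assignments if key[0] not in active_pod_uids]
        for pod_uid, container in stale:
            print(f'RemoveStaleState: removing container podUID="{pod_uid}" '
                  f'containerName="{container}"')
        return {k: v for k, v in assignments.items() if k not in stale}

    state = {("a45acaac-b9b3-4ca7-9aca-7c4e67ec145f", "sg-core"): "cpuset:0-3"}
    state = remove_stale_state(state, active_pod_uids={"5dd7d19f-79e4-47c9-9934-cc003fe551db"})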
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.337543 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.341846 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.357042 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404146 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404382 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404447 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404534 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404563 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.404647 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506423 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506760 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.506965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507041 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507149 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507267 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.507820 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.509097 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.512688 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513513 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513592 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.513844 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.527716 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"ceilometer-0\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.665302 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 21 14:49:50 crc kubenswrapper[4720]: I0121 14:49:50.697222 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45acaac-b9b3-4ca7-9aca-7c4e67ec145f" path="/var/lib/kubelet/pods/a45acaac-b9b3-4ca7-9aca-7c4e67ec145f/volumes"
Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.122033 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.133017 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.246027 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af"}
Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.247132 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerStarted","Data":"48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a"}
Jan 21 14:49:51 crc kubenswrapper[4720]: I0121 14:49:51.278941 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-vm954" podStartSLOduration=2.729075439 podStartE2EDuration="15.278918645s" podCreationTimestamp="2026-01-21 14:49:36 +0000 UTC" firstStartedPulling="2026-01-21 14:49:37.548311389 +0000 UTC m=+1215.457051321" lastFinishedPulling="2026-01-21 14:49:50.098154595 +0000 UTC m=+1228.006894527" observedRunningTime="2026-01-21 14:49:51.271071481 +0000 UTC m=+1229.179811423" watchObservedRunningTime="2026-01-21 14:49:51.278918645 +0000 UTC m=+1229.187658587"
Jan 21 14:49:52 crc kubenswrapper[4720]: I0121 14:49:52.880484 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:49:52 crc kubenswrapper[4720]: I0121 14:49:52.881064 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:49:53 crc kubenswrapper[4720]: I0121 14:49:53.262042 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d"}
Jan 21 14:49:58 crc kubenswrapper[4720]: I0121 14:49:58.304670 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe"}
Jan 21 14:49:58 crc kubenswrapper[4720]: I0121 14:49:58.305075 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2"}
Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.322487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerStarted","Data":"0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8"}
Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.322747 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Jan 21 14:50:00 crc kubenswrapper[4720]: I0121 14:50:00.354185 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.105467285 podStartE2EDuration="10.354168908s" podCreationTimestamp="2026-01-21 14:49:50 +0000 UTC" firstStartedPulling="2026-01-21 14:49:51.132831901 +0000 UTC m=+1229.041571833" lastFinishedPulling="2026-01-21 14:49:59.381533524 +0000 UTC m=+1237.290273456" observedRunningTime="2026-01-21 14:50:00.353401293 +0000 UTC m=+1238.262141225" watchObservedRunningTime="2026-01-21 14:50:00.354168908 +0000 UTC m=+1238.262908840"
Jan 21 14:50:07 crc kubenswrapper[4720]: I0121 14:50:07.400021 4720 generic.go:334] "Generic (PLEG): container finished" podID="4dda8050-939a-4a64-b119-b718b60c7887" containerID="48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a" exitCode=0
Jan 21 14:50:07 crc kubenswrapper[4720]: I0121 14:50:07.400109 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerDied","Data":"48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a"}
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.721920 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954"
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860025 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") "
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860062 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") "
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") "
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.860248 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") pod \"4dda8050-939a-4a64-b119-b718b60c7887\" (UID: \"4dda8050-939a-4a64-b119-b718b60c7887\") "
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.884490 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts" (OuterVolumeSpecName: "scripts") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.884797 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz" (OuterVolumeSpecName: "kube-api-access-59cmz") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "kube-api-access-59cmz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.886038 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data" (OuterVolumeSpecName: "config-data") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.890833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dda8050-939a-4a64-b119-b718b60c7887" (UID: "4dda8050-939a-4a64-b119-b718b60c7887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962008 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962055 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-config-data\") on node \"crc\" DevicePath \"\""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962069 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59cmz\" (UniqueName: \"kubernetes.io/projected/4dda8050-939a-4a64-b119-b718b60c7887-kube-api-access-59cmz\") on node \"crc\" DevicePath \"\""
Jan 21 14:50:08 crc kubenswrapper[4720]: I0121 14:50:08.962082 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dda8050-939a-4a64-b119-b718b60c7887-scripts\") on node \"crc\" DevicePath \"\""
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421311 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-vm954"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421287 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-vm954" event={"ID":"4dda8050-939a-4a64-b119-b718b60c7887","Type":"ContainerDied","Data":"09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49"}
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.421740 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ea94304001bcd335265fbc7a965a41b7b4d379c78f211cc4b029153d465e49"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.560601 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 14:50:09 crc kubenswrapper[4720]: E0121 14:50:09.561022 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561038 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561272 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dda8050-939a-4a64-b119-b718b60c7887" containerName="nova-cell0-conductor-db-sync"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.561925 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.569427 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pqrwq"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.569548 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.572469 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676766 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676853 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.676897 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.778904 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.778979 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.779077 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.782431 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.782673 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.798667 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcn2h\" (UniqueName: \"kubernetes.io/projected/496cefe3-f97b-4d8c-9a25-4a6533d9e64c-kube-api-access-lcn2h\") pod \"nova-cell0-conductor-0\" (UID: \"496cefe3-f97b-4d8c-9a25-4a6533d9e64c\") " pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:09 crc kubenswrapper[4720]: I0121 14:50:09.879074 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:10 crc kubenswrapper[4720]: I0121 14:50:10.306555 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 21 14:50:10 crc kubenswrapper[4720]: I0121 14:50:10.431488 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"496cefe3-f97b-4d8c-9a25-4a6533d9e64c","Type":"ContainerStarted","Data":"ed3065c6ccdab478fef7ce2febafea92e13791b8f1eec5733513fca88ae4ab3e"}
Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.441555 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"496cefe3-f97b-4d8c-9a25-4a6533d9e64c","Type":"ContainerStarted","Data":"2a680d9aa6a57e0b34471343eed81fcf53b0f5f5c62294c89e943586c1975389"}
Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.442665 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:11 crc kubenswrapper[4720]: I0121 14:50:11.462065 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.462047108 podStartE2EDuration="2.462047108s" podCreationTimestamp="2026-01-21 14:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:11.454291095 +0000 UTC m=+1249.363031027" watchObservedRunningTime="2026-01-21 14:50:11.462047108 +0000 UTC m=+1249.370787030"
Jan 21 14:50:19 crc kubenswrapper[4720]: I0121 14:50:19.905083 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.397780 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.398883 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.407147 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.409364 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.409579 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454777 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454852 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454888 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.454922 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556090 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556225 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556264 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.556294 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.575512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.576723 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.586209 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.588180 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.589301 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.600131 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.611399 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.623621 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.625034 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"nova-cell0-cell-mapping-jcm9t\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " pod="openstack/nova-cell0-cell-mapping-jcm9t"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.627091 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.633139 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658665 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658732 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658786 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658835 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658860 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658886 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.658912 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0"
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.677320 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.708618 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.710152 4720 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.714243 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.730179 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.748096 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.758191 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781549 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781620 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781644 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781729 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781762 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781796 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781824 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781884 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781938 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.781964 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.786749 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.795761 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.795910 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.798865 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.801565 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.843165 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"nova-api-0\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " pod="openstack/nova-api-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.847269 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:20 crc 
kubenswrapper[4720]: I0121 14:50:20.848606 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.866273 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.874526 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"nova-cell1-novncproxy-0\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.886714 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887792 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887897 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.887932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.888018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.888512 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.894259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.895322 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.916263 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 
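
The interleaving above is the kubelet sync loop at work: "SyncLoop ADD" fires when a pod first arrives from the API server, "SyncLoop UPDATE" as the API object mutates, util.go's "No sandbox for pod can be found. Need to start a new one" when the first pod sandbox is created, and PLEG events later as containers come up. A rough per-pod timeline can be extracted with a sketch like the following (the journal prefix format and one-record-per-line layout are assumptions about the export, not about the kubelet):

import re
import sys

# Build a rough per-pod timeline from SyncLoop / sandbox / PLEG records.
PATTERNS = [
    ("SYNC_ADD", re.compile(r'"SyncLoop ADD" source="api" pods=\["([^"]+)"\]')),
    ("SYNC_UPDATE", re.compile(r'"SyncLoop UPDATE" source="api" pods=\["([^"]+)"\]')),
    ("NEW_SANDBOX", re.compile(r'Need to start a new one" pod="([^"]+)"')),
    ("PLEG_EVENT", re.compile(r'"SyncLoop \(PLEG\): event for pod" pod="([^"]+)"')),
]

for line in sys.stdin:
    stamp = line[:15]  # e.g. "Jan 21 14:50:20" from the journal prefix
    for kind, rx in PATTERNS:
        if (m := rx.search(line)):
            print(f"{stamp} {kind:<12} {m.group(1)}")
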
14:50:20.917822 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.953275 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.953345 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"nova-metadata-0\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " pod="openstack/nova-metadata-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989629 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989684 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989711 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989734 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.989760 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990269 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990313 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:20 crc kubenswrapper[4720]: I0121 14:50:20.990337 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.003261 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.014782 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.023810 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.091965 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092053 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092093 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092124 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092154 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092236 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.092354 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.095715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.100546 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.102084 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.105202 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.109462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.111308 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.114518 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"nova-scheduler-0\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.114957 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"dnsmasq-dns-566b5b7845-mbq5w\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.175802 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.238642 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.478029 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.569318 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerStarted","Data":"fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c"} Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.652232 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.795092 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.924007 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:21 crc kubenswrapper[4720]: I0121 14:50:21.962483 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.038169 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.039152 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.041976 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.043647 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.058897 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137090 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.137200 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.147079 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:50:22 crc kubenswrapper[4720]: W0121 14:50:22.150815 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f51fb54_b6cb_4a03_b378_714f549cd2a1.slice/crio-380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f WatchSource:0}: Error finding container 380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f: Status 404 returned error can't find the container with id 380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243084 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243581 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243707 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.243777 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.252445 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.255231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.265417 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.268742 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"nova-cell1-conductor-db-sync-wwmwq\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.412043 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585587 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerID="c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b" exitCode=0 Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585693 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.585718 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerStarted","Data":"380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.589376 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerStarted","Data":"0cb0e309b39bce610a72ded1405c14599f22a6c20641e56fd95873ffd0658fca"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.593780 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerStarted","Data":"1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.596120 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"8d9596c402475afa19cd2db87682ff4b62a90d083e56dc241ca3ef4472369439"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.600415 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerStarted","Data":"288c85dff47a79ec2a1b499393b40b7854d1dcf0eb1a7514afc8487559facb57"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.607607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"3f3474f70de2d6017a446624474e6e64bbd65c26a6c2176f0176715419053ba3"} Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.649535 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jcm9t" podStartSLOduration=2.649510535 podStartE2EDuration="2.649510535s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:22.633208603 +0000 UTC m=+1260.541948545" watchObservedRunningTime="2026-01-21 14:50:22.649510535 +0000 UTC m=+1260.558250467" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.880112 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.880157 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.881722 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.882321 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:50:22 crc kubenswrapper[4720]: I0121 14:50:22.882366 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" gracePeriod=600 Jan 21 14:50:23 crc kubenswrapper[4720]: I0121 14:50:23.047138 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 14:50:23 crc kubenswrapper[4720]: I0121 14:50:23.627202 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerStarted","Data":"2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80"} Jan 21 14:50:23 crc kubenswrapper[4720]: E0121 14:50:23.834840 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1128ddd_06c2_4255_aa17_b62aa0f8a996.slice/crio-conmon-c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.637493 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerStarted","Data":"ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.639240 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.651863 4720 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerStarted","Data":"0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657127 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" exitCode=0 Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657185 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6"} Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.657246 4720 scope.go:117] "RemoveContainer" containerID="533cdaf61eeca84a9c75ff12c4bc63c6833cac28437ed5151fede2f9b5a4f6a6" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.668362 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podStartSLOduration=4.668344846 podStartE2EDuration="4.668344846s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:24.664967604 +0000 UTC m=+1262.573707546" watchObservedRunningTime="2026-01-21 14:50:24.668344846 +0000 UTC m=+1262.577084778" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.699833 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" podStartSLOduration=2.699815091 podStartE2EDuration="2.699815091s" podCreationTimestamp="2026-01-21 14:50:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:24.699323021 +0000 UTC m=+1262.608062953" watchObservedRunningTime="2026-01-21 14:50:24.699815091 +0000 UTC m=+1262.608555023" Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.723844 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:24 crc kubenswrapper[4720]: I0121 14:50:24.790195 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:50:27 crc kubenswrapper[4720]: I0121 14:50:27.584707 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:27 crc kubenswrapper[4720]: I0121 14:50:27.586703 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" containerID="cri-o://ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" gracePeriod=30 Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.614324 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.703821 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") pod \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\" (UID: \"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7\") " Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.721031 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h" (OuterVolumeSpecName: "kube-api-access-mcq8h") pod "ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" (UID: "ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7"). InnerVolumeSpecName "kube-api-access-mcq8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.756207 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759481 4720 generic.go:334] "Generic (PLEG): container finished" podID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" exitCode=2 Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759519 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerDied","Data":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759541 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7","Type":"ContainerDied","Data":"b5de03c99a86e921243af3619119b73c952c5f3ccc688bb6fd4a69b6fda32dd9"} Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759555 4720 scope.go:117] "RemoveContainer" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.759645 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.800381 4720 scope.go:117] "RemoveContainer" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: E0121 14:50:28.803116 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": container with ID starting with ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92 not found: ID does not exist" containerID="ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.803153 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92"} err="failed to get container status \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": rpc error: code = NotFound desc = could not find container \"ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92\": container with ID starting with ebd6d3199ec576dfbbe1f0b22d6431e8b4bb04c2a4b40f4a23c829c2a972ee92 not found: ID does not exist" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.805893 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcq8h\" (UniqueName: \"kubernetes.io/projected/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7-kube-api-access-mcq8h\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.845013 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.864215 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872143 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: E0121 14:50:28.872463 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872480 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.872644 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" containerName="kube-state-metrics" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.873250 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.880685 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.898871 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 21 14:50:28 crc kubenswrapper[4720]: I0121 14:50:28.899108 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010185 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010471 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010575 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.010604 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.066436 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.066937 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" containerID="cri-o://09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.067006 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" containerID="cri-o://4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.067053 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" containerID="cri-o://2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.068033 4720 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" containerID="cri-o://0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.111887 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.112153 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.112756 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.113026 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.118626 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.119116 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.119933 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.133758 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps8n\" (UniqueName: \"kubernetes.io/projected/60d4c6e3-4a01-421e-aad1-1972ed16e528-kube-api-access-zps8n\") pod \"kube-state-metrics-0\" (UID: \"60d4c6e3-4a01-421e-aad1-1972ed16e528\") " pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.212560 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.770937 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerStarted","Data":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.774988 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" exitCode=0 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775018 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" exitCode=2 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775075 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.775151 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.776753 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.776781 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerStarted","Data":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.777775 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerStarted","Data":"43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.777902 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.788889 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.788940 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerStarted","Data":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.789248 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" containerID="cri-o://625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.789409 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" containerID="cri-o://2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" gracePeriod=30 Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.800510 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.397837825 podStartE2EDuration="9.800494103s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.932876714 +0000 UTC m=+1259.841616646" lastFinishedPulling="2026-01-21 14:50:28.335532992 +0000 UTC m=+1266.244272924" observedRunningTime="2026-01-21 14:50:29.799982513 +0000 UTC m=+1267.708722455" watchObservedRunningTime="2026-01-21 14:50:29.800494103 +0000 UTC m=+1267.709234035" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.828497 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.172110635 podStartE2EDuration="9.8283568s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.673946988 +0000 UTC m=+1259.582686920" lastFinishedPulling="2026-01-21 14:50:28.330193153 +0000 UTC m=+1266.238933085" observedRunningTime="2026-01-21 14:50:29.820200369 +0000 UTC m=+1267.728940311" watchObservedRunningTime="2026-01-21 14:50:29.8283568 +0000 UTC m=+1267.737096732" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.874736 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.880284 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.492479152 podStartE2EDuration="9.880260004s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.961562346 +0000 UTC m=+1259.870302278" lastFinishedPulling="2026-01-21 14:50:28.349343198 +0000 UTC m=+1266.258083130" observedRunningTime="2026-01-21 14:50:29.848109797 +0000 UTC m=+1267.756849729" watchObservedRunningTime="2026-01-21 14:50:29.880260004 +0000 UTC m=+1267.788999936" Jan 21 14:50:29 crc kubenswrapper[4720]: I0121 14:50:29.897753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.360201027 podStartE2EDuration="9.897734568s" podCreationTimestamp="2026-01-21 14:50:20 +0000 UTC" firstStartedPulling="2026-01-21 14:50:21.801460645 +0000 UTC m=+1259.710200577" lastFinishedPulling="2026-01-21 14:50:28.338994186 +0000 UTC m=+1266.247734118" observedRunningTime="2026-01-21 14:50:29.891154626 +0000 UTC m=+1267.799894558" watchObservedRunningTime="2026-01-21 14:50:29.897734568 +0000 UTC m=+1267.806474500" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.506468 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.592961 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593023 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593087 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593162 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") pod \"b4a03426-f037-45b9-8415-306cc3d2a735\" (UID: \"b4a03426-f037-45b9-8415-306cc3d2a735\") " Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593288 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs" (OuterVolumeSpecName: "logs") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.593555 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b4a03426-f037-45b9-8415-306cc3d2a735-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.608816 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw" (OuterVolumeSpecName: "kube-api-access-qtptw") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "kube-api-access-qtptw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.625192 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.635441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data" (OuterVolumeSpecName: "config-data") pod "b4a03426-f037-45b9-8415-306cc3d2a735" (UID: "b4a03426-f037-45b9-8415-306cc3d2a735"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.688017 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7" path="/var/lib/kubelet/pods/ad0da9a3-ceaf-4fad-bfe6-dd329a1b37f7/volumes" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694875 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694907 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4a03426-f037-45b9-8415-306cc3d2a735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.694920 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtptw\" (UniqueName: \"kubernetes.io/projected/b4a03426-f037-45b9-8415-306cc3d2a735-kube-api-access-qtptw\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.799256 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" exitCode=0 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.799317 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.801893 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60d4c6e3-4a01-421e-aad1-1972ed16e528","Type":"ContainerStarted","Data":"b02a2e2c9834b0d63d9f335f9a3cc75d801cb2b6e47c24b7c8e1f9e825ba5396"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.802058 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"60d4c6e3-4a01-421e-aad1-1972ed16e528","Type":"ContainerStarted","Data":"ec0fe64aa799c048205ddaaddeeacf190263a67aeec33a00b9e7c524c15f979a"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.802420 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804278 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4a03426-f037-45b9-8415-306cc3d2a735" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" exitCode=0 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804309 4720 generic.go:334] "Generic (PLEG): container finished" podID="b4a03426-f037-45b9-8415-306cc3d2a735" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" exitCode=143 Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804333 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804358 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804379 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b4a03426-f037-45b9-8415-306cc3d2a735","Type":"ContainerDied","Data":"3f3474f70de2d6017a446624474e6e64bbd65c26a6c2176f0176715419053ba3"} Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804387 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.804568 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.854044 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.471599029 podStartE2EDuration="2.853802163s" podCreationTimestamp="2026-01-21 14:50:28 +0000 UTC" firstStartedPulling="2026-01-21 14:50:29.857592033 +0000 UTC m=+1267.766331965" lastFinishedPulling="2026-01-21 14:50:30.239795157 +0000 UTC m=+1268.148535099" observedRunningTime="2026-01-21 14:50:30.848866492 +0000 UTC m=+1268.757606424" watchObservedRunningTime="2026-01-21 14:50:30.853802163 +0000 UTC m=+1268.762542115" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.890340 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.896086 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906216 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.906548 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906563 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.906625 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906633 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906789 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-metadata" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.906802 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" containerName="nova-metadata-log" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.907596 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.912485 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.913357 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.924987 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.930037 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965291 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.965750 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965778 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} err="failed to get container status \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965798 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: E0121 14:50:30.965978 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.965999 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} err="failed to get container status \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966012 4720 scope.go:117] "RemoveContainer" containerID="2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966837 4720 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615"} err="failed to get container status \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": rpc error: code = NotFound desc = could not find container \"2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615\": container with ID starting with 2179f70c0f1ed27cb4ffbaf9ad12b4230a9db29471ffdbbbcda8014c22b42615 not found: ID does not exist" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.966887 4720 scope.go:117] "RemoveContainer" containerID="625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b" Jan 21 14:50:30 crc kubenswrapper[4720]: I0121 14:50:30.967291 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b"} err="failed to get container status \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": rpc error: code = NotFound desc = could not find container \"625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b\": container with ID starting with 625aa9c2af709ec13f1a7d17354a7763d16e4128f3e89e12c9209faa9f02898b not found: ID does not exist" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:30.999894 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000060 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000099 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000136 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.000172 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.004004 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.017821 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.017966 
4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101830 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101881 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101910 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101936 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.101987 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.102779 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.106548 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.107259 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.111099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.137861 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh65s\" (UniqueName: 
\"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"nova-metadata-0\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.177354 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.177396 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.234935 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.239827 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.243273 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.362001 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.362443 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" containerID="cri-o://332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" gracePeriod=10 Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.826664 4720 generic.go:334] "Generic (PLEG): container finished" podID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerID="332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" exitCode=0 Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.826860 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95"} Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.886916 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 14:50:31 crc kubenswrapper[4720]: I0121 14:50:31.936430 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.062033 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.062623 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.170:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.069376 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136588 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136635 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136750 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136779 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.136893 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") pod \"2a0b57dc-517a-404a-a47d-1f86009fad51\" (UID: \"2a0b57dc-517a-404a-a47d-1f86009fad51\") " Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.159705 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8" (OuterVolumeSpecName: "kube-api-access-8bww8") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "kube-api-access-8bww8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.241061 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bww8\" (UniqueName: \"kubernetes.io/projected/2a0b57dc-517a-404a-a47d-1f86009fad51-kube-api-access-8bww8\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.279173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.284614 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.300170 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config" (OuterVolumeSpecName: "config") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.308971 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2a0b57dc-517a-404a-a47d-1f86009fad51" (UID: "2a0b57dc-517a-404a-a47d-1f86009fad51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342417 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342670 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342764 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.342851 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a0b57dc-517a-404a-a47d-1f86009fad51-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.713711 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a03426-f037-45b9-8415-306cc3d2a735" path="/var/lib/kubelet/pods/b4a03426-f037-45b9-8415-306cc3d2a735/volumes" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.835787 4720 generic.go:334] "Generic (PLEG): container finished" podID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerID="1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530" exitCode=0 Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.835935 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerDied","Data":"1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838341 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838506 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.838582 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerStarted","Data":"0c16d2b92ff3ecd28762fa538726ba17cdec4aab7f351b70cec6f1779e6f1d03"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844382 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" event={"ID":"2a0b57dc-517a-404a-a47d-1f86009fad51","Type":"ContainerDied","Data":"c28c627632464fe77ab3019eb5addeb203e6d652bffef45ba63964d4aabbdd0c"} Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844415 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-7f22v" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.844746 4720 scope.go:117] "RemoveContainer" containerID="332550904f3e433cd4d02f319dc6acd4e70218fd003f8a0d716e6b8b5738ed95" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.871566 4720 scope.go:117] "RemoveContainer" containerID="4d8b9a33cc2b4409a467cae14fe05fabf4e1586debbfc3178a4978e092725506" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.943854 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.943837246 podStartE2EDuration="2.943837246s" podCreationTimestamp="2026-01-21 14:50:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:32.921783046 +0000 UTC m=+1270.830522978" watchObservedRunningTime="2026-01-21 14:50:32.943837246 +0000 UTC m=+1270.852577178" Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.945541 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:32 crc kubenswrapper[4720]: I0121 14:50:32.974496 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-7f22v"] Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.857756 4720 generic.go:334] "Generic (PLEG): container finished" podID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerID="2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" exitCode=0 Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858006 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2"} Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858204 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5dd7d19f-79e4-47c9-9934-cc003fe551db","Type":"ContainerDied","Data":"cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af"} Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.858223 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb3753944b96c83d03a1863c25acebd264308098d7a3d05463fb79438fec08af" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.942936 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973595 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973724 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973834 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973856 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973942 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.973989 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") pod \"5dd7d19f-79e4-47c9-9934-cc003fe551db\" (UID: \"5dd7d19f-79e4-47c9-9934-cc003fe551db\") " Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.980788 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.981560 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.992944 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts" (OuterVolumeSpecName: "scripts") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:33 crc kubenswrapper[4720]: I0121 14:50:33.994785 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5" (OuterVolumeSpecName: "kube-api-access-7ggb5") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "kube-api-access-7ggb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.020781 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075405 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075433 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ggb5\" (UniqueName: \"kubernetes.io/projected/5dd7d19f-79e4-47c9-9934-cc003fe551db-kube-api-access-7ggb5\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075442 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5dd7d19f-79e4-47c9-9934-cc003fe551db-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075450 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.075458 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.136911 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.159864 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data" (OuterVolumeSpecName: "config-data") pod "5dd7d19f-79e4-47c9-9934-cc003fe551db" (UID: "5dd7d19f-79e4-47c9-9934-cc003fe551db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.181725 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.181788 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dd7d19f-79e4-47c9-9934-cc003fe551db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.336986 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.385578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386009 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386045 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.386112 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") pod \"b57a2637-15ee-4c59-881b-9364ffde9ffc\" (UID: \"b57a2637-15ee-4c59-881b-9364ffde9ffc\") " Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.401774 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts" (OuterVolumeSpecName: "scripts") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.408852 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg" (OuterVolumeSpecName: "kube-api-access-9w4hg") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "kube-api-access-9w4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.445497 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.445589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data" (OuterVolumeSpecName: "config-data") pod "b57a2637-15ee-4c59-881b-9364ffde9ffc" (UID: "b57a2637-15ee-4c59-881b-9364ffde9ffc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489022 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489065 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w4hg\" (UniqueName: \"kubernetes.io/projected/b57a2637-15ee-4c59-881b-9364ffde9ffc-kube-api-access-9w4hg\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489084 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.489109 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b57a2637-15ee-4c59-881b-9364ffde9ffc-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.687522 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" path="/var/lib/kubelet/pods/2a0b57dc-517a-404a-a47d-1f86009fad51/volumes" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.868089 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869072 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jcm9t" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jcm9t" event={"ID":"b57a2637-15ee-4c59-881b-9364ffde9ffc","Type":"ContainerDied","Data":"fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c"} Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.869529 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd968c05b8bb02b11dbdb28cde7af3788cf08c2b3fd8fe15c592bc03162e9a7c" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.894349 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.904602 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947389 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947903 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947928 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947942 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947952 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947964 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947970 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.947992 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.947999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948020 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948033 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948041 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: E0121 14:50:34.948057 4720 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="init" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="init" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948252 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="proxy-httpd" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948268 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-central-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948277 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" containerName="nova-manage" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948287 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="sg-core" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948300 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" containerName="ceilometer-notification-agent" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.948310 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0b57dc-517a-404a-a47d-1f86009fad51" containerName="dnsmasq-dns" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.950145 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954028 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954248 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.954376 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.965719 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.995908 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996170 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996566 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996699 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996828 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:34 crc kubenswrapper[4720]: I0121 14:50:34.996946 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.062682 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.062918 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" containerID="cri-o://3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.063051 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" containerID="cri-o://1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.084823 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.085397 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" containerID="cri-o://aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098182 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098267 4720 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098317 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098344 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098379 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098403 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098437 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.098836 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.100133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.122217 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.122519 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.123384 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.124499 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.125276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.130142 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.131267 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" containerID="cri-o://3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.130559 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" containerID="cri-o://ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" gracePeriod=30 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.155337 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"ceilometer-0\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.280283 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.796933 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.877525 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"095eb50a9535270990dd51b698c9cf80b5e404f52878168a52724d7ce2256d5b"} Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.879449 4720 generic.go:334] "Generic (PLEG): container finished" podID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" exitCode=143 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.879515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.881584 4720 generic.go:334] "Generic (PLEG): container finished" podID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" exitCode=143 Jan 21 14:50:35 crc kubenswrapper[4720]: I0121 14:50:35.881633 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.179794 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.185528 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.187428 4720 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.187469 4720 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.243724 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.243804 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 
14:50:36.688787 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd7d19f-79e4-47c9-9934-cc003fe551db" path="/var/lib/kubelet/pods/5dd7d19f-79e4-47c9-9934-cc003fe551db/volumes" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.797831 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839756 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839820 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839890 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.839960 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.840011 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") pod \"8b5b74e2-e979-488c-a3aa-cdb564e41206\" (UID: \"8b5b74e2-e979-488c-a3aa-cdb564e41206\") " Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.841092 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs" (OuterVolumeSpecName: "logs") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.860151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s" (OuterVolumeSpecName: "kube-api-access-nh65s") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "kube-api-access-nh65s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.867392 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data" (OuterVolumeSpecName: "config-data") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.887618 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.905638 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917320 4720 generic.go:334] "Generic (PLEG): container finished" podID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" exitCode=0 Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917388 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917542 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8b5b74e2-e979-488c-a3aa-cdb564e41206","Type":"ContainerDied","Data":"0c16d2b92ff3ecd28762fa538726ba17cdec4aab7f351b70cec6f1779e6f1d03"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.917567 4720 scope.go:117] "RemoveContainer" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.918464 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.936307 4720 generic.go:334] "Generic (PLEG): container finished" podID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerID="0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c" exitCode=0 Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.936634 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerDied","Data":"0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c"} Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942137 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b5b74e2-e979-488c-a3aa-cdb564e41206-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942164 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh65s\" (UniqueName: \"kubernetes.io/projected/8b5b74e2-e979-488c-a3aa-cdb564e41206-kube-api-access-nh65s\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942175 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.942186 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.960891 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8b5b74e2-e979-488c-a3aa-cdb564e41206" (UID: "8b5b74e2-e979-488c-a3aa-cdb564e41206"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.965769 4720 scope.go:117] "RemoveContainer" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.988568 4720 scope.go:117] "RemoveContainer" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.989360 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": container with ID starting with 3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1 not found: ID does not exist" containerID="3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.989404 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1"} err="failed to get container status \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": rpc error: code = NotFound desc = could not find container \"3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1\": container with ID starting with 3f32fcb17c187f66d65615dc419ba0052c0cdf7ee2c73f5696c11ca220e567b1 not found: ID does not exist" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.989432 4720 scope.go:117] "RemoveContainer" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: E0121 14:50:36.990671 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": container with ID starting with ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e not found: ID does not exist" containerID="ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e" Jan 21 14:50:36 crc kubenswrapper[4720]: I0121 14:50:36.990695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e"} err="failed to get container status \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": rpc error: code = NotFound desc = could not find container \"ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e\": container with ID starting with ee550feb8e3a823073f178ad0f84667098adcf2f2a73bef1092302367c6a9e6e not found: ID does not exist" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.044605 4720 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b5b74e2-e979-488c-a3aa-cdb564e41206-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.261752 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.272022 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.281315 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: E0121 14:50:37.282227 4720 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282332 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: E0121 14:50:37.282407 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282465 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.282936 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-log" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.283024 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" containerName="nova-metadata-metadata" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.286084 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.288885 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.289238 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.301515 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.352942 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.353282 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.354073 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") pod \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\" (UID: \"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6\") " Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356145 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356248 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs682\" (UniqueName: 
\"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356520 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356679 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.356825 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.377992 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.379482 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl" (OuterVolumeSpecName: "kube-api-access-fwpkl") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "kube-api-access-fwpkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.431095 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.444332 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data" (OuterVolumeSpecName: "config-data") pod "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" (UID: "684b734f-b6c9-47c9-a8e6-696eb7b0e5d6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461835 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461974 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462067 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462204 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462275 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.462401 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwpkl\" (UniqueName: \"kubernetes.io/projected/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6-kube-api-access-fwpkl\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.461840 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.466119 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.466594 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.468490 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.476026 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"nova-metadata-0\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.617711 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.953061 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.955424 4720 generic.go:334] "Generic (PLEG): container finished" podID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" exitCode=0 Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.955907 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerDied","Data":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958399 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"684b734f-b6c9-47c9-a8e6-696eb7b0e5d6","Type":"ContainerDied","Data":"0cb0e309b39bce610a72ded1405c14599f22a6c20641e56fd95873ffd0658fca"} Jan 21 14:50:37 crc kubenswrapper[4720]: I0121 14:50:37.958415 4720 scope.go:117] "RemoveContainer" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.018876 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.050308 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.059984 4720 scope.go:117] "RemoveContainer" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.062774 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: E0121 14:50:38.063241 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.063255 4720 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.063431 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" containerName="nova-scheduler-scheduler" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.064046 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: E0121 14:50:38.064609 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": container with ID starting with aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7 not found: ID does not exist" containerID="aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.064631 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7"} err="failed to get container status \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": rpc error: code = NotFound desc = could not find container \"aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7\": container with ID starting with aa4eba24f81a8563f110a15ad5721600a8508111b6e91a07d196d385d0352fe7 not found: ID does not exist" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.069358 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.088097 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176466 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176569 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.176638 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.240236 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.277731 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc 
kubenswrapper[4720]: I0121 14:50:38.277831 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.277888 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.283204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.285261 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.296408 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"nova-scheduler-0\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.310012 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382728 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.382958 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.383347 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") pod \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\" (UID: \"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7\") " Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.389852 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts" (OuterVolumeSpecName: "scripts") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.389920 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw" (OuterVolumeSpecName: "kube-api-access-62smw") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "kube-api-access-62smw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.393915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.428174 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.430850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data" (OuterVolumeSpecName: "config-data") pod "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" (UID: "6f23517c-a9a1-4740-8b3b-d42b40cc8bc7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485813 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485853 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62smw\" (UniqueName: \"kubernetes.io/projected/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-kube-api-access-62smw\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485865 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.485873 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.697123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="684b734f-b6c9-47c9-a8e6-696eb7b0e5d6" path="/var/lib/kubelet/pods/684b734f-b6c9-47c9-a8e6-696eb7b0e5d6/volumes" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.698280 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5b74e2-e979-488c-a3aa-cdb564e41206" path="/var/lib/kubelet/pods/8b5b74e2-e979-488c-a3aa-cdb564e41206/volumes" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.880852 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.976100 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:38 crc kubenswrapper[4720]: I0121 14:50:38.994196 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014354 4720 generic.go:334] "Generic (PLEG): container finished" podID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" exitCode=0 Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014451 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"513a3b4c-405a-4045-a76b-acf59f0cfd3a","Type":"ContainerDied","Data":"8d9596c402475afa19cd2db87682ff4b62a90d083e56dc241ca3ef4472369439"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014528 4720 scope.go:117] "RemoveContainer" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.014625 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053607 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" event={"ID":"6f23517c-a9a1-4740-8b3b-d42b40cc8bc7","Type":"ContainerDied","Data":"2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053644 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fb036b7bf2aa15771341a929f076f669e5c637e7af2b17fb8de16677e0b5e80" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.053719 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwmwq" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.054506 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055133 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055283 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055319 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055329 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.055347 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055356 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055938 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-api" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055977 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" containerName="nova-api-log" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.055997 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" containerName="nova-cell1-conductor-db-sync" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.060706 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.061612 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerStarted","Data":"6a4188e9bbe7707a1cbd5fc7c33ecb7166835f18ea14b69fcb7fc8e351f09029"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.062206 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.066249 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070930 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070971 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.070980 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerStarted","Data":"9bd73af5fd59322a2bd5b4dadb3b5852cd6bfb2cf195e8e11949965c74ef70f1"} Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.104523 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.104923 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105012 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105269 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") pod \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\" (UID: \"513a3b4c-405a-4045-a76b-acf59f0cfd3a\") " Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105707 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.105829 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.111829 4720 scope.go:117] "RemoveContainer" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.112404 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs" (OuterVolumeSpecName: "logs") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.119000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v" (OuterVolumeSpecName: "kube-api-access-flf6v") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "kube-api-access-flf6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.124391 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.124364651 podStartE2EDuration="2.124364651s" podCreationTimestamp="2026-01-21 14:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:39.117630046 +0000 UTC m=+1277.026369988" watchObservedRunningTime="2026-01-21 14:50:39.124364651 +0000 UTC m=+1277.033104583" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.151570 4720 scope.go:117] "RemoveContainer" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.152104 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": container with ID starting with 1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45 not found: ID does not exist" containerID="1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.152150 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45"} err="failed to get container status \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": rpc error: code = NotFound desc = could not find container \"1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45\": container with ID starting with 1a7aba18286d833bce25e8e85b8d4351c6d249b6433918eb9ba77d5c982b9b45 not found: ID does not exist" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 
14:50:39.152170 4720 scope.go:117] "RemoveContainer" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: E0121 14:50:39.152409 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": container with ID starting with 3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1 not found: ID does not exist" containerID="3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.152427 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1"} err="failed to get container status \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": rpc error: code = NotFound desc = could not find container \"3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1\": container with ID starting with 3e9948de7472ed8920759f420aefe5ec7761ed425b16da329167d68730b928c1 not found: ID does not exist" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.173771 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data" (OuterVolumeSpecName: "config-data") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.180845 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "513a3b4c-405a-4045-a76b-acf59f0cfd3a" (UID: "513a3b4c-405a-4045-a76b-acf59f0cfd3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.207929 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.207994 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208050 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208199 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208215 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flf6v\" (UniqueName: \"kubernetes.io/projected/513a3b4c-405a-4045-a76b-acf59f0cfd3a-kube-api-access-flf6v\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208228 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/513a3b4c-405a-4045-a76b-acf59f0cfd3a-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.208261 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/513a3b4c-405a-4045-a76b-acf59f0cfd3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.211784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.217226 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/679bb64e-c157-415f-9214-0f4e62001f03-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.224063 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrbcd\" (UniqueName: \"kubernetes.io/projected/679bb64e-c157-415f-9214-0f4e62001f03-kube-api-access-jrbcd\") pod \"nova-cell1-conductor-0\" (UID: \"679bb64e-c157-415f-9214-0f4e62001f03\") " pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.249756 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 21 
14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.398535 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.411152 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.461425 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.485996 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.487608 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.490140 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.512197 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642574 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642642 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642677 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.642715 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745742 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745899 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: 
\"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.745991 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.747162 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.755536 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.766138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.769510 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"nova-api-0\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.812780 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:50:39 crc kubenswrapper[4720]: I0121 14:50:39.926256 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.084558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerStarted","Data":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.093239 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerStarted","Data":"e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.093865 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.095706 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"679bb64e-c157-415f-9214-0f4e62001f03","Type":"ContainerStarted","Data":"d2faa52133bae0219c5f601518ddac5869d940e38bedf4585888fa7d17866164"} Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.111398 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.111376061 podStartE2EDuration="3.111376061s" podCreationTimestamp="2026-01-21 14:50:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:40.101227463 +0000 UTC m=+1278.009967405" watchObservedRunningTime="2026-01-21 14:50:40.111376061 +0000 UTC m=+1278.020115993" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.161613 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.732283312 podStartE2EDuration="6.161596413s" podCreationTimestamp="2026-01-21 14:50:34 +0000 UTC" firstStartedPulling="2026-01-21 14:50:35.805870437 +0000 UTC m=+1273.714610379" lastFinishedPulling="2026-01-21 14:50:39.235183548 +0000 UTC m=+1277.143923480" observedRunningTime="2026-01-21 14:50:40.121483688 +0000 UTC m=+1278.030223620" watchObservedRunningTime="2026-01-21 14:50:40.161596413 +0000 UTC m=+1278.070336345" Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.309931 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:50:40 crc kubenswrapper[4720]: I0121 14:50:40.687980 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="513a3b4c-405a-4045-a76b-acf59f0cfd3a" path="/var/lib/kubelet/pods/513a3b4c-405a-4045-a76b-acf59f0cfd3a/volumes" Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109760 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.109832 4720 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerStarted","Data":"1ac2709f0fcedf3f81608ca1a0f69ad5080c7f047fe98d3ffad1aa7ecce36ad0"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.111956 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"679bb64e-c157-415f-9214-0f4e62001f03","Type":"ContainerStarted","Data":"dac570b967201ec13bde624a0f28edde829ebf2e73a8c0ad20002188376c9b7b"} Jan 21 14:50:41 crc kubenswrapper[4720]: I0121 14:50:41.130774 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.130754691 podStartE2EDuration="2.130754691s" podCreationTimestamp="2026-01-21 14:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:41.125707787 +0000 UTC m=+1279.034447729" watchObservedRunningTime="2026-01-21 14:50:41.130754691 +0000 UTC m=+1279.039494623" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.119593 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.617818 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:42 crc kubenswrapper[4720]: I0121 14:50:42.617871 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.394300 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.964895 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=4.96463424 podStartE2EDuration="4.96463424s" podCreationTimestamp="2026-01-21 14:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:50:41.158101399 +0000 UTC m=+1279.066841331" watchObservedRunningTime="2026-01-21 14:50:43.96463424 +0000 UTC m=+1281.873374202" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.965227 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.967076 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:43 crc kubenswrapper[4720]: I0121 14:50:43.985354 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.035968 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.036181 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.036284 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138905 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.138938 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.139419 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.139803 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.159460 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"redhat-operators-prdsm\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.298626 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:44 crc kubenswrapper[4720]: I0121 14:50:44.781339 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.144609 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" exitCode=0 Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.144781 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414"} Jan 21 14:50:45 crc kubenswrapper[4720]: I0121 14:50:45.145019 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"abc55b9d285c58116da7b148c4e87092b6925a74a8b7d4e6ff71e53eb61cdc76"} Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.162876 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.618325 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:50:47 crc kubenswrapper[4720]: I0121 14:50:47.618373 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.394992 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.418589 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.631899 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:48 crc kubenswrapper[4720]: I0121 14:50:48.631931 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.206882 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.440460 4720 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.813675 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:49 crc kubenswrapper[4720]: I0121 14:50:49.813720 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:50:50 crc kubenswrapper[4720]: I0121 14:50:50.895886 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:50 crc kubenswrapper[4720]: I0121 14:50:50.895887 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:50:51 crc kubenswrapper[4720]: I0121 14:50:51.197257 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" exitCode=0 Jan 21 14:50:51 crc kubenswrapper[4720]: I0121 14:50:51.197329 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} Jan 21 14:50:53 crc kubenswrapper[4720]: I0121 14:50:53.223780 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerStarted","Data":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} Jan 21 14:50:53 crc kubenswrapper[4720]: I0121 14:50:53.246871 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prdsm" podStartSLOduration=2.98547359 podStartE2EDuration="10.246851185s" podCreationTimestamp="2026-01-21 14:50:43 +0000 UTC" firstStartedPulling="2026-01-21 14:50:45.146444516 +0000 UTC m=+1283.055184448" lastFinishedPulling="2026-01-21 14:50:52.407822071 +0000 UTC m=+1290.316562043" observedRunningTime="2026-01-21 14:50:53.245864126 +0000 UTC m=+1291.154604098" watchObservedRunningTime="2026-01-21 14:50:53.246851185 +0000 UTC m=+1291.155591127" Jan 21 14:50:54 crc kubenswrapper[4720]: I0121 14:50:54.300247 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:54 crc kubenswrapper[4720]: I0121 14:50:54.301063 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:50:55 crc kubenswrapper[4720]: I0121 14:50:55.355280 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" probeResult="failure" output=< Jan 21 14:50:55 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:50:55 crc kubenswrapper[4720]: > Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.635934 4720 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.636332 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.680207 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:50:57 crc kubenswrapper[4720]: I0121 14:50:57.682009 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.816931 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.817215 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.817492 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.818101 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.829832 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:50:59 crc kubenswrapper[4720]: I0121 14:50:59.835287 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.061495 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.063640 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.083667 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144802 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144837 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.144903 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.145017 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.145050 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246545 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246645 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246676 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.246741 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.247711 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.247758 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.248367 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.248481 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.265910 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"dnsmasq-dns-5b856c5697-9p6zm\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.288055 4720 generic.go:334] "Generic (PLEG): container finished" podID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerID="43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" exitCode=137 Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.289794 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerDied","Data":"43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437"} Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.473393 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.627174 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759155 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759219 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.759301 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") pod \"05605fa3-fac7-4375-8a3b-ff90d2664098\" (UID: \"05605fa3-fac7-4375-8a3b-ff90d2664098\") " Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.773033 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l" (OuterVolumeSpecName: "kube-api-access-tpb4l") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "kube-api-access-tpb4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.794490 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.819000 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data" (OuterVolumeSpecName: "config-data") pod "05605fa3-fac7-4375-8a3b-ff90d2664098" (UID: "05605fa3-fac7-4375-8a3b-ff90d2664098"). InnerVolumeSpecName "config-data". 
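
Note: the reconciler_common.go entries throughout this window are the kubelet volume manager comparing its desired state against its actual state on each pass: mount what a newly admitted pod needs, unmount what a deleted pod leaves behind. That is why the nova-cell1-novncproxy-0 teardown above interleaves freely with fresh mounts for its replacement; each pod's volumes are reconciled independently. A conceptual sketch of the pass, with every identifier illustrative rather than the kubelet's real types:

```go
package sketch

// volume stands in for a kubelet volume record; illustrative only.
type volume struct{ name string }

type reconciler struct {
	desired map[string]volume // desired state of world: what running pods need
	actual  map[string]volume // actual state of world: what is mounted now
}

func (r *reconciler) reconcilePass() {
	// Unmount volumes that are mounted but no longer desired; this is the
	// "operationExecutor.UnmountVolume started" -> "UnmountVolume.TearDown
	// succeeded" -> "Volume detached" sequence in the log.
	for name, v := range r.actual {
		if _, stillWanted := r.desired[name]; !stillWanted {
			unmount(v)
		}
	}
	// Mount volumes that are desired but not yet mounted; this is the
	// "VerifyControllerAttachedVolume started" -> "MountVolume started"
	// -> "MountVolume.SetUp succeeded" sequence.
	for name, v := range r.desired {
		if _, mounted := r.actual[name]; !mounted {
			verifyAttached(v)
			mount(v)
		}
	}
}

func unmount(volume)        {}
func verifyAttached(volume) {}
func mount(volume)          {}
```
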
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.860914 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpb4l\" (UniqueName: \"kubernetes.io/projected/05605fa3-fac7-4375-8a3b-ff90d2664098-kube-api-access-tpb4l\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.861162 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:00 crc kubenswrapper[4720]: I0121 14:51:00.861253 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05605fa3-fac7-4375-8a3b-ff90d2664098-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.048758 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.297235 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.298346 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05605fa3-fac7-4375-8a3b-ff90d2664098","Type":"ContainerDied","Data":"288c85dff47a79ec2a1b499393b40b7854d1dcf0eb1a7514afc8487559facb57"} Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.298411 4720 scope.go:117] "RemoveContainer" containerID="43256d114f7b72d2ce26d562115a7a1fc28bb5530d0b1203ac1a95fee0c62437" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.300685 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"62714c9a0f1a3b425c21ab81569bf9c4c0ba1448aea15537467fba81fe36bdf5"} Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.334231 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.339446 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.390422 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: E0121 14:51:01.391731 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.391789 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.400883 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" containerName="nova-cell1-novncproxy-novncproxy" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.402184 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.404819 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.404853 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.405154 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.410265 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477350 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477403 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477463 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477494 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.477557 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.579722 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580047 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580242 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580335 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.580490 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.585622 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.592185 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ea3e3dd-0e39-4a28-9112-27f0874af221-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.600115 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngm4w\" (UniqueName: \"kubernetes.io/projected/5ea3e3dd-0e39-4a28-9112-27f0874af221-kube-api-access-ngm4w\") pod \"nova-cell1-novncproxy-0\" (UID: \"5ea3e3dd-0e39-4a28-9112-27f0874af221\") " pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:01 crc kubenswrapper[4720]: I0121 14:51:01.720175 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.181783 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.314605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ea3e3dd-0e39-4a28-9112-27f0874af221","Type":"ContainerStarted","Data":"c656f4a85b4359ec9b810d9745ead588aa464c8e0e2def306f5ade53cbe97c15"} Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.693314 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05605fa3-fac7-4375-8a3b-ff90d2664098" path="/var/lib/kubelet/pods/05605fa3-fac7-4375-8a3b-ff90d2664098/volumes" Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.780999 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.781261 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" containerID="cri-o://970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" gracePeriod=30 Jan 21 14:51:02 crc kubenswrapper[4720]: I0121 14:51:02.781330 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" containerID="cri-o://58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" gracePeriod=30 Jan 21 14:51:03 crc kubenswrapper[4720]: I0121 14:51:03.324162 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.336599 4720 generic.go:334] "Generic (PLEG): container finished" podID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerID="970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" exitCode=143 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.336713 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.340585 4720 generic.go:334] "Generic (PLEG): container finished" podID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" exitCode=0 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.340782 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.342730 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5ea3e3dd-0e39-4a28-9112-27f0874af221","Type":"ContainerStarted","Data":"2f03ec3393a2878809a279f895ea20586e7f009c9c39727ea085c5a0d12d7584"} Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.403300 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" 
podStartSLOduration=3.403279987 podStartE2EDuration="3.403279987s" podCreationTimestamp="2026-01-21 14:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:04.389756745 +0000 UTC m=+1302.298496677" watchObservedRunningTime="2026-01-21 14:51:04.403279987 +0000 UTC m=+1302.312019919" Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.628867 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629426 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" containerID="cri-o://5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629559 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" containerID="cri-o://e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629601 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" containerID="cri-o://00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.629638 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" containerID="cri-o://2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" gracePeriod=30 Jan 21 14:51:04 crc kubenswrapper[4720]: I0121 14:51:04.737529 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.177:3000/\": read tcp 10.217.0.2:39390->10.217.0.177:3000: read: connection reset by peer" Jan 21 14:51:05 crc kubenswrapper[4720]: E0121 14:51:05.001158 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12d3be7d_16ff_43df_a7d5_266f2b1d4308.slice/crio-conmon-e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.280995 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.177:3000/\": dial tcp 10.217.0.177:3000: connect: connection refused" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.347941 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" probeResult="failure" output=< Jan 21 14:51:05 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 14:51:05 crc kubenswrapper[4720]: > Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361438 4720 
generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" exitCode=0 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361481 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" exitCode=2 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361493 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" exitCode=0 Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361533 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361561 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.361573 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.369843 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerStarted","Data":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.389986 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" podStartSLOduration=5.389970538 podStartE2EDuration="5.389970538s" podCreationTimestamp="2026-01-21 14:51:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:05.388962222 +0000 UTC m=+1303.297702154" watchObservedRunningTime="2026-01-21 14:51:05.389970538 +0000 UTC m=+1303.298710470" Jan 21 14:51:05 crc kubenswrapper[4720]: I0121 14:51:05.474448 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.385495 4720 generic.go:334] "Generic (PLEG): container finished" podID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerID="2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" exitCode=0 Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.385735 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab"} Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.508707 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590723 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590773 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590802 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590839 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590870 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.590999 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591039 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591060 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") pod \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\" (UID: \"12d3be7d-16ff-43df-a7d5-266f2b1d4308\") " Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591443 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.591562 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.596163 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts" (OuterVolumeSpecName: "scripts") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.600029 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp" (OuterVolumeSpecName: "kube-api-access-2fwtp") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "kube-api-access-2fwtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.615016 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.663886 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693063 4720 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693092 4720 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693101 4720 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693111 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fwtp\" (UniqueName: \"kubernetes.io/projected/12d3be7d-16ff-43df-a7d5-266f2b1d4308-kube-api-access-2fwtp\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693121 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.693129 4720 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12d3be7d-16ff-43df-a7d5-266f2b1d4308-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.696173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.708539 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data" (OuterVolumeSpecName: "config-data") pod "12d3be7d-16ff-43df-a7d5-266f2b1d4308" (UID: "12d3be7d-16ff-43df-a7d5-266f2b1d4308"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.750076 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.796470 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:06 crc kubenswrapper[4720]: I0121 14:51:06.796510 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12d3be7d-16ff-43df-a7d5-266f2b1d4308-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.414307 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.416694 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"12d3be7d-16ff-43df-a7d5-266f2b1d4308","Type":"ContainerDied","Data":"095eb50a9535270990dd51b698c9cf80b5e404f52878168a52724d7ce2256d5b"} Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.416728 4720 scope.go:117] "RemoveContainer" containerID="e559aa0454450a58853ddf4a50d570c328194ddcfdc0c3464369f91c9c21dee1" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.459749 4720 scope.go:117] "RemoveContainer" containerID="00334da872f5838cba633c1df535018745aa753f28eccf9d38d17542b2d83557" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.468441 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.478515 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506347 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506825 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506840 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506854 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506869 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506892 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506900 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: E0121 14:51:07.506922 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.506928 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507095 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="proxy-httpd" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507110 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-notification-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507118 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="ceilometer-central-agent" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.507134 4720 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" containerName="sg-core" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.508814 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.511471 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.511782 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.518367 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.519099 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.519602 4720 scope.go:117] "RemoveContainer" containerID="2e4657d7e4e72bd0317d9a4097deb7850089e8ed3ac24d33b7e7cfaf4e9621ab" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.577938 4720 scope.go:117] "RemoveContainer" containerID="5d5bd9374ec63c57d2fd3d7df27f609498666bfefedad7b65b9e20757f2269d6" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613011 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613429 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613556 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613698 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613779 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613873 4720 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.613975 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.715964 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716023 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716058 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716076 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716100 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716116 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.716176 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc 
kubenswrapper[4720]: I0121 14:51:07.717728 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-log-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.717999 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-run-httpd\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.721231 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.723826 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.724245 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-scripts\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.724477 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-config-data\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.726689 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.740785 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4czs\" (UniqueName: \"kubernetes.io/projected/2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe-kube-api-access-x4czs\") pod \"ceilometer-0\" (UID: \"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe\") " pod="openstack/ceilometer-0" Jan 21 14:51:07 crc kubenswrapper[4720]: I0121 14:51:07.881775 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.372119 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 21 14:51:08 crc kubenswrapper[4720]: W0121 14:51:08.377015 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d3ebf5b_f0c4_472e_b4a3_e5f8cab66ffe.slice/crio-6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7 WatchSource:0}: Error finding container 6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7: Status 404 returned error can't find the container with id 6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7 Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.446492 4720 generic.go:334] "Generic (PLEG): container finished" podID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerID="58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" exitCode=0 Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.447112 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778"} Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.451639 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"6145a1a426dc81c6cf4893eae54d0d3bb01c00551e04e8fd2aeb03ed6790fba7"} Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.611459 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.689176 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d3be7d-16ff-43df-a7d5-266f2b1d4308" path="/var/lib/kubelet/pods/12d3be7d-16ff-43df-a7d5-266f2b1d4308/volumes" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736689 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736787 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.736842 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") pod \"a072861f-6e44-4b30-8666-7dc9b0e2078e\" (UID: \"a072861f-6e44-4b30-8666-7dc9b0e2078e\") " Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.737321 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs" (OuterVolumeSpecName: "logs") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.746514 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24" (OuterVolumeSpecName: "kube-api-access-vxg24") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "kube-api-access-vxg24". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.772120 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.775325 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data" (OuterVolumeSpecName: "config-data") pod "a072861f-6e44-4b30-8666-7dc9b0e2078e" (UID: "a072861f-6e44-4b30-8666-7dc9b0e2078e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838863 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a072861f-6e44-4b30-8666-7dc9b0e2078e-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838901 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxg24\" (UniqueName: \"kubernetes.io/projected/a072861f-6e44-4b30-8666-7dc9b0e2078e-kube-api-access-vxg24\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838917 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:08 crc kubenswrapper[4720]: I0121 14:51:08.838930 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a072861f-6e44-4b30-8666-7dc9b0e2078e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.461502 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"27ad18ebd592c9ea8551cc0959933b0dd3090402e1c4eacace1304ec6a39a7d4"} Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465673 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a072861f-6e44-4b30-8666-7dc9b0e2078e","Type":"ContainerDied","Data":"1ac2709f0fcedf3f81608ca1a0f69ad5080c7f047fe98d3ffad1aa7ecce36ad0"} Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465723 4720 scope.go:117] "RemoveContainer" containerID="58f63d184705ab6432d547aaa2cb911c01794d62ad6af01203240526c283f778" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.465727 4720 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.491360 4720 scope.go:117] "RemoveContainer" containerID="970bcf65e5a09e77255def09f218971569442aab8e3ec98881ef4af0f5c9e750" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.507201 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.519571 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538404 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: E0121 14:51:09.538877 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538900 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: E0121 14:51:09.538916 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.538925 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.539137 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-api" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.539154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" containerName="nova-api-log" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.540256 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.541626 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.542267 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.543719 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.556604 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655123 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655267 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655299 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655315 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655515 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.655565 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757141 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757186 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.757913 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758168 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758247 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.758702 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.763214 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.763855 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.766068 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.770992 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.783462 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"nova-api-0\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " pod="openstack/nova-api-0" Jan 
21 14:51:09 crc kubenswrapper[4720]: I0121 14:51:09.860533 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.377555 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.476984 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.515313 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"4bb2d185eecf673998e9d879b9729a3d12306e355feaebcc9b978ba415abebc0"} Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.584048 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.584320 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" containerID="cri-o://ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" gracePeriod=10 Jan 21 14:51:10 crc kubenswrapper[4720]: I0121 14:51:10.738615 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a072861f-6e44-4b30-8666-7dc9b0e2078e" path="/var/lib/kubelet/pods/a072861f-6e44-4b30-8666-7dc9b0e2078e/volumes" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.239757 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.173:5353: connect: connection refused" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.524035 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"af79cf129c292c74d6920bee5c752043bb2bfd7a327724ba70c9e06d8b214ebc"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.526330 4720 generic.go:334] "Generic (PLEG): container finished" podID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerID="ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" exitCode=0 Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.526375 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.527882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.669012 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.727562 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.742993 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792221 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792281 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792334 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792372 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.792479 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") pod \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\" (UID: \"0f51fb54-b6cb-4a03-b378-714f549cd2a1\") " Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.799966 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz" (OuterVolumeSpecName: "kube-api-access-7bnkz") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "kube-api-access-7bnkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.845256 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.850322 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899501 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bnkz\" (UniqueName: \"kubernetes.io/projected/0f51fb54-b6cb-4a03-b378-714f549cd2a1-kube-api-access-7bnkz\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899532 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.899542 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.918259 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config" (OuterVolumeSpecName: "config") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:11 crc kubenswrapper[4720]: I0121 14:51:11.943567 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f51fb54-b6cb-4a03-b378-714f549cd2a1" (UID: "0f51fb54-b6cb-4a03-b378-714f549cd2a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.001322 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.001686 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f51fb54-b6cb-4a03-b378-714f549cd2a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.538339 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"aa28edecd3076d27d100618795053d2ff9074f6f62d9d1c55a24a5b14a96f5c5"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540538 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" event={"ID":"0f51fb54-b6cb-4a03-b378-714f549cd2a1","Type":"ContainerDied","Data":"380b121eb00fcd604ad13adc40e1a9168307fc04dc188009c76758ceb903fd8f"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540626 4720 scope.go:117] "RemoveContainer" containerID="ec586fa0a9aadd81cd43534105b8c70d16abdfd751fc33f2fa3e2b01263a5d1d" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.540815 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-mbq5w" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.545910 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerStarted","Data":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"} Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.566536 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.573564 4720 scope.go:117] "RemoveContainer" containerID="c53b9b942e3700ab88cecf03857239f8c69e629fa546f404751ee79be8529e6b" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.581018 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.581000381 podStartE2EDuration="3.581000381s" podCreationTimestamp="2026-01-21 14:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:12.577353643 +0000 UTC m=+1310.486093575" watchObservedRunningTime="2026-01-21 14:51:12.581000381 +0000 UTC m=+1310.489740313" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.615610 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.628183 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-mbq5w"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.700989 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" path="/var/lib/kubelet/pods/0f51fb54-b6cb-4a03-b378-714f549cd2a1/volumes" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807551 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:12 crc kubenswrapper[4720]: E0121 14:51:12.807928 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="init" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807946 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="init" Jan 21 14:51:12 crc kubenswrapper[4720]: E0121 14:51:12.807956 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.807962 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.809154 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f51fb54-b6cb-4a03-b378-714f549cd2a1" containerName="dnsmasq-dns" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.809929 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.812942 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.813134 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.821053 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.927398 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.927759 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.928026 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:12 crc kubenswrapper[4720]: I0121 14:51:12.928243 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029523 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029686 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029718 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.029737 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.036055 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.036319 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.037044 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.054415 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"nova-cell1-cell-mapping-7qf47\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.125755 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.569024 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe","Type":"ContainerStarted","Data":"c39eea449240cfba58293812deb344598d0485713df04e5bf59b49f0a0f0cbdf"} Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.571201 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 21 14:51:13 crc kubenswrapper[4720]: W0121 14:51:13.626470 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8fc07ed_67cb_4459_b7cb_ea8101ea4317.slice/crio-81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa WatchSource:0}: Error finding container 81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa: Status 404 returned error can't find the container with id 81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.632456 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.905445861 podStartE2EDuration="6.632437197s" podCreationTimestamp="2026-01-21 14:51:07 +0000 UTC" firstStartedPulling="2026-01-21 14:51:08.380379586 +0000 UTC m=+1306.289119508" lastFinishedPulling="2026-01-21 14:51:13.107370912 +0000 UTC m=+1311.016110844" observedRunningTime="2026-01-21 14:51:13.613692669 +0000 UTC m=+1311.522432621" watchObservedRunningTime="2026-01-21 14:51:13.632437197 +0000 UTC m=+1311.541177129" Jan 21 14:51:13 crc kubenswrapper[4720]: I0121 14:51:13.646576 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.344472 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.398800 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.579605 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerStarted","Data":"08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac"} Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.579675 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerStarted","Data":"81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa"} Jan 21 14:51:14 crc kubenswrapper[4720]: I0121 14:51:14.596360 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7qf47" podStartSLOduration=2.596346232 podStartE2EDuration="2.596346232s" podCreationTimestamp="2026-01-21 14:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:14.59593515 +0000 UTC m=+1312.504675082" watchObservedRunningTime="2026-01-21 14:51:14.596346232 +0000 UTC m=+1312.505086164" Jan 21 14:51:15 crc kubenswrapper[4720]: I0121 14:51:15.165546 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:15 crc kubenswrapper[4720]: I0121 14:51:15.585987 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-prdsm" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" containerID="cri-o://0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" gracePeriod=2 Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.110111 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235129 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.235412 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") pod \"5825e26f-385a-4384-a0e6-18a04e49ddf7\" (UID: \"5825e26f-385a-4384-a0e6-18a04e49ddf7\") " Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.236435 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities" (OuterVolumeSpecName: "utilities") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.242169 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5" (OuterVolumeSpecName: "kube-api-access-5mnj5") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "kube-api-access-5mnj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.337374 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.337424 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mnj5\" (UniqueName: \"kubernetes.io/projected/5825e26f-385a-4384-a0e6-18a04e49ddf7-kube-api-access-5mnj5\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.361786 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5825e26f-385a-4384-a0e6-18a04e49ddf7" (UID: "5825e26f-385a-4384-a0e6-18a04e49ddf7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.438926 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5825e26f-385a-4384-a0e6-18a04e49ddf7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596228 4720 generic.go:334] "Generic (PLEG): container finished" podID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" exitCode=0 Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596584 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdsm" event={"ID":"5825e26f-385a-4384-a0e6-18a04e49ddf7","Type":"ContainerDied","Data":"abc55b9d285c58116da7b148c4e87092b6925a74a8b7d4e6ff71e53eb61cdc76"} Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596602 4720 scope.go:117] "RemoveContainer" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.596760 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdsm" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.637620 4720 scope.go:117] "RemoveContainer" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.666161 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.674683 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prdsm"] Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.679396 4720 scope.go:117] "RemoveContainer" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.691945 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" path="/var/lib/kubelet/pods/5825e26f-385a-4384-a0e6-18a04e49ddf7/volumes" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705034 4720 scope.go:117] "RemoveContainer" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.705423 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": container with ID starting with 0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3 not found: ID does not exist" containerID="0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705454 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3"} err="failed to get container status \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": rpc error: code = NotFound desc 
= could not find container \"0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3\": container with ID starting with 0a9a0f7c5df4c76c0caa2f798f4b2308333087ebb4d4bf1855716ce187fae8d3 not found: ID does not exist" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705481 4720 scope.go:117] "RemoveContainer" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.705864 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": container with ID starting with 2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab not found: ID does not exist" containerID="2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705885 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab"} err="failed to get container status \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": rpc error: code = NotFound desc = could not find container \"2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab\": container with ID starting with 2afc82d2610bfc44ba979738c0ceeef7f7d90c62e91e764f5284e10ea48d21ab not found: ID does not exist" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.705900 4720 scope.go:117] "RemoveContainer" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: E0121 14:51:16.706118 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": container with ID starting with 8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414 not found: ID does not exist" containerID="8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414" Jan 21 14:51:16 crc kubenswrapper[4720]: I0121 14:51:16.706165 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414"} err="failed to get container status \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": rpc error: code = NotFound desc = could not find container \"8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414\": container with ID starting with 8c7954ba9a69f367fe5dd10c0acf22cad0338ae74bc42c2b1b294122ddc99414 not found: ID does not exist" Jan 21 14:51:18 crc kubenswrapper[4720]: I0121 14:51:18.615962 4720 generic.go:334] "Generic (PLEG): container finished" podID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerID="08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac" exitCode=0 Jan 21 14:51:18 crc kubenswrapper[4720]: I0121 14:51:18.616041 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerDied","Data":"08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac"} Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.861174 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.861489 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:19 crc kubenswrapper[4720]: I0121 14:51:19.979691 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123750 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123862 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123895 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.123955 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") pod \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\" (UID: \"d8fc07ed-67cb-4459-b7cb-ea8101ea4317\") " Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.130611 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml" (OuterVolumeSpecName: "kube-api-access-rkgml") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "kube-api-access-rkgml". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.152082 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts" (OuterVolumeSpecName: "scripts") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.157940 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data" (OuterVolumeSpecName: "config-data") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.163880 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8fc07ed-67cb-4459-b7cb-ea8101ea4317" (UID: "d8fc07ed-67cb-4459-b7cb-ea8101ea4317"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226186 4720 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-scripts\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226222 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgml\" (UniqueName: \"kubernetes.io/projected/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-kube-api-access-rkgml\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226235 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.226243 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8fc07ed-67cb-4459-b7cb-ea8101ea4317-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.638797 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7qf47" event={"ID":"d8fc07ed-67cb-4459-b7cb-ea8101ea4317","Type":"ContainerDied","Data":"81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa"} Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.639062 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81ffa8c47b30455ffc34b4535407b825c43082cf4376435684682b0464290caa" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.639150 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7qf47" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.846813 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.847330 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" containerID="cri-o://0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.847416 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" containerID="cri-o://534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.858026 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.858253 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" containerID="cri-o://0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.876909 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.877128 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" 
containerName="nova-metadata-log" containerID="cri-o://9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.877532 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" containerID="cri-o://2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" gracePeriod=30 Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.880221 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:20 crc kubenswrapper[4720]: I0121 14:51:20.880221 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.186:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.649918 4720 generic.go:334] "Generic (PLEG): container finished" podID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" exitCode=143 Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.649988 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.651837 4720 generic.go:334] "Generic (PLEG): container finished" podID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" exitCode=143 Jan 21 14:51:21 crc kubenswrapper[4720]: I0121 14:51:21.651897 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.234807 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.384808 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.385005 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.385043 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") pod \"c65466e3-8bac-41f3-855f-202b0a6f9e82\" (UID: \"c65466e3-8bac-41f3-855f-202b0a6f9e82\") " Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.398599 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47" (OuterVolumeSpecName: "kube-api-access-5gw47") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "kube-api-access-5gw47". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.411824 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data" (OuterVolumeSpecName: "config-data") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.421645 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c65466e3-8bac-41f3-855f-202b0a6f9e82" (UID: "c65466e3-8bac-41f3-855f-202b0a6f9e82"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487827 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487877 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gw47\" (UniqueName: \"kubernetes.io/projected/c65466e3-8bac-41f3-855f-202b0a6f9e82-kube-api-access-5gw47\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.487896 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65466e3-8bac-41f3-855f-202b0a6f9e82-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672629 4720 generic.go:334] "Generic (PLEG): container finished" podID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" exitCode=0 Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672688 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerDied","Data":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.673095 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c65466e3-8bac-41f3-855f-202b0a6f9e82","Type":"ContainerDied","Data":"6a4188e9bbe7707a1cbd5fc7c33ecb7166835f18ea14b69fcb7fc8e351f09029"} Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.672707 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.673165 4720 scope.go:117] "RemoveContainer" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.697259 4720 scope.go:117] "RemoveContainer" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.697634 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": container with ID starting with 0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f not found: ID does not exist" containerID="0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.697776 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f"} err="failed to get container status \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": rpc error: code = NotFound desc = could not find container \"0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f\": container with ID starting with 0229617962c429f7fcb15b291fd09407ae1937ea495d5cde0e4d779fd887ce9f not found: ID does not exist" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.719542 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.727992 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.733695 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734046 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734065 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734085 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-utilities" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734091 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-utilities" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734106 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734112 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: E0121 14:51:23.734125 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734131 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc 
kubenswrapper[4720]: E0121 14:51:23.734143 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-content" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734150 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="extract-content" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734293 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5825e26f-385a-4384-a0e6-18a04e49ddf7" containerName="registry-server" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734309 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" containerName="nova-scheduler-scheduler" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734323 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" containerName="nova-manage" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.734841 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.738374 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.756766 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793079 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793120 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.793150 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894308 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894363 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.894387 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.897973 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.899118 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/039c7115-f471-47ad-a7c4-75b1d7a40a94-config-data\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:23 crc kubenswrapper[4720]: I0121 14:51:23.913397 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg4f\" (UniqueName: \"kubernetes.io/projected/039c7115-f471-47ad-a7c4-75b1d7a40a94-kube-api-access-jkg4f\") pod \"nova-scheduler-0\" (UID: \"039c7115-f471-47ad-a7c4-75b1d7a40a94\") " pod="openstack/nova-scheduler-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.017158 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:38262->10.217.0.178:8775: read: connection reset by peer" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.017593 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.178:8775/\": read tcp 10.217.0.2:38254->10.217.0.178:8775: read: connection reset by peer" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.050189 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.490401 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.493205 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605210 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605273 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605453 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605500 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.605611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") pod \"cc263e55-641f-47c7-ac02-f863d7cafa11\" (UID: \"cc263e55-641f-47c7-ac02-f863d7cafa11\") " Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.607930 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs" (OuterVolumeSpecName: "logs") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.613851 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682" (OuterVolumeSpecName: "kube-api-access-vs682") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "kube-api-access-vs682". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.641036 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.641866 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data" (OuterVolumeSpecName: "config-data") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.680214 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cc263e55-641f-47c7-ac02-f863d7cafa11" (UID: "cc263e55-641f-47c7-ac02-f863d7cafa11"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.687930 4720 generic.go:334] "Generic (PLEG): container finished" podID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" exitCode=0 Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.688013 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.701427 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65466e3-8bac-41f3-855f-202b0a6f9e82" path="/var/lib/kubelet/pods/c65466e3-8bac-41f3-855f-202b0a6f9e82/volumes" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702586 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039c7115-f471-47ad-a7c4-75b1d7a40a94","Type":"ContainerStarted","Data":"8b5443bd6e0295f14b7abeec2709f0ba24bba33a6203357b06cd4d671535736d"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702614 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702630 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cc263e55-641f-47c7-ac02-f863d7cafa11","Type":"ContainerDied","Data":"9bd73af5fd59322a2bd5b4dadb3b5852cd6bfb2cf195e8e11949965c74ef70f1"} Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.702649 4720 scope.go:117] "RemoveContainer" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709458 4720 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709493 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709504 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc263e55-641f-47c7-ac02-f863d7cafa11-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc 
kubenswrapper[4720]: I0121 14:51:24.709515 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs682\" (UniqueName: \"kubernetes.io/projected/cc263e55-641f-47c7-ac02-f863d7cafa11-kube-api-access-vs682\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.709526 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc263e55-641f-47c7-ac02-f863d7cafa11-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.734997 4720 scope.go:117] "RemoveContainer" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.744126 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.752688 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.768514 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.768967 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.768985 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.769015 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769022 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769178 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-metadata" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.769198 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" containerName="nova-metadata-log" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.771635 4720 scope.go:117] "RemoveContainer" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.777230 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.778482 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": container with ID starting with 2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f not found: ID does not exist" containerID="2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.778520 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f"} err="failed to get container status \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": rpc error: code = NotFound desc = could not find container \"2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f\": container with ID starting with 2f1103123dd5e075dacd8fa477da83f98abac31eddfad5f03662900eaa3eaf4f not found: ID does not exist" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.778590 4720 scope.go:117] "RemoveContainer" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.779425 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.779792 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 21 14:51:24 crc kubenswrapper[4720]: E0121 14:51:24.781237 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": container with ID starting with 9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6 not found: ID does not exist" containerID="9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.781274 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6"} err="failed to get container status \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": rpc error: code = NotFound desc = could not find container \"9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6\": container with ID starting with 9057ae4407090c3f0de53130117d8031d3cff0a8a571aa339b38f2bb43a521c6 not found: ID does not exist" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.802716 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810264 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810315 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810340 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810434 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.810532 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913866 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913942 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913962 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.913983 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.914023 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.915283 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7177980c-4db3-4902-aac2-c0825b778b2a-logs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.919821 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.926144 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-config-data\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.929357 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7177980c-4db3-4902-aac2-c0825b778b2a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:24 crc kubenswrapper[4720]: I0121 14:51:24.934339 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5ff\" (UniqueName: \"kubernetes.io/projected/7177980c-4db3-4902-aac2-c0825b778b2a-kube-api-access-kq5ff\") pod \"nova-metadata-0\" (UID: \"7177980c-4db3-4902-aac2-c0825b778b2a\") " pod="openstack/nova-metadata-0" Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.103892 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.588225 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.698320 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"3398fbc0a33494e3166dc5a170a67b9b49df448837581f4a1cc4f92a0e2e21de"} Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.699620 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"039c7115-f471-47ad-a7c4-75b1d7a40a94","Type":"ContainerStarted","Data":"c53420e4b575d7a9b57dd50454fd5a1bb2b67341f75d699a3bd146fa3d5c109b"} Jan 21 14:51:25 crc kubenswrapper[4720]: I0121 14:51:25.717834 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.717791289 podStartE2EDuration="2.717791289s" podCreationTimestamp="2026-01-21 14:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:25.715698582 +0000 UTC m=+1323.624438534" watchObservedRunningTime="2026-01-21 14:51:25.717791289 +0000 UTC m=+1323.626531231" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.704120 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc263e55-641f-47c7-ac02-f863d7cafa11" path="/var/lib/kubelet/pods/cc263e55-641f-47c7-ac02-f863d7cafa11/volumes" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731013 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731243 4720 generic.go:334] "Generic (PLEG): container finished" podID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" exitCode=0 Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731378 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"} Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731455 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"103562f8-b254-4684-80a8-5e6ff5160cfd","Type":"ContainerDied","Data":"4bb2d185eecf673998e9d879b9729a3d12306e355feaebcc9b978ba415abebc0"} Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.731477 4720 scope.go:117] "RemoveContainer" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.735351 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"0ccd76eb91ea3922e26d1bd6af83a1762bbeac0de1ab86d5fbb2cc92068f0849"} Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.735436 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7177980c-4db3-4902-aac2-c0825b778b2a","Type":"ContainerStarted","Data":"37a84f7f848bd89032dcbc59d157fe027e7dd905add3c4033c58af74f35ebc9e"} Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752563 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752618 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752640 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752694 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752753 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.752806 4720 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") pod \"103562f8-b254-4684-80a8-5e6ff5160cfd\" (UID: \"103562f8-b254-4684-80a8-5e6ff5160cfd\") " Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.755893 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs" (OuterVolumeSpecName: "logs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.763445 4720 scope.go:117] "RemoveContainer" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.807032 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.809438 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v" (OuterVolumeSpecName: "kube-api-access-f565v") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "kube-api-access-f565v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.816930 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data" (OuterVolumeSpecName: "config-data") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.827022 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82699777 podStartE2EDuration="2.82699777s" podCreationTimestamp="2026-01-21 14:51:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:26.807785859 +0000 UTC m=+1324.716525821" watchObservedRunningTime="2026-01-21 14:51:26.82699777 +0000 UTC m=+1324.735737702" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.840773 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.851634 4720 scope.go:117] "RemoveContainer" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" Jan 21 14:51:26 crc kubenswrapper[4720]: E0121 14:51:26.853824 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": container with ID starting with 534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22 not found: ID does not exist" containerID="534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.853864 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22"} err="failed to get container status \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": rpc error: code = NotFound desc = could not find container \"534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22\": container with ID starting with 534ecbf69be69a494a611503e1a8227ae178292371a91be06471fe6b3c4f7c22 not found: ID does not exist" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.853890 4720 scope.go:117] "RemoveContainer" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: E0121 14:51:26.854151 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": container with ID starting with 0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3 not found: ID does not exist" containerID="0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854173 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3"} err="failed to get container status \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": rpc error: code = NotFound desc = could not find container \"0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3\": container with ID starting with 0d8f61c10eeb188ef15e3e02849cf9f08da31fa9b55964abb17c6472a740e2b3 not found: ID does not exist" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854294 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854364 4720 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/103562f8-b254-4684-80a8-5e6ff5160cfd-logs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854374 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f565v\" (UniqueName: \"kubernetes.io/projected/103562f8-b254-4684-80a8-5e6ff5160cfd-kube-api-access-f565v\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854385 4720 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.854394 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.863694 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "103562f8-b254-4684-80a8-5e6ff5160cfd" (UID: "103562f8-b254-4684-80a8-5e6ff5160cfd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:51:26 crc kubenswrapper[4720]: I0121 14:51:26.955634 4720 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/103562f8-b254-4684-80a8-5e6ff5160cfd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.750588 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.790753 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.797753 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.815832 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: E0121 14:51:27.816264 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816285 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: E0121 14:51:27.816311 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816319 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816513 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-api" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.816552 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" containerName="nova-api-log" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.821365 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.823615 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.824787 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.824974 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.849278 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870536 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870742 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870827 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.870878 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.871035 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971813 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971871 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971896 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971915 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.971956 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.972000 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.972949 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/33c62270-7ab4-416b-bf5f-e0007f477733-logs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.977316 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-internal-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.977405 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.978171 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-config-data\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.980138 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c62270-7ab4-416b-bf5f-e0007f477733-public-tls-certs\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " pod="openstack/nova-api-0" Jan 21 14:51:27 crc kubenswrapper[4720]: I0121 14:51:27.989470 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp97j\" (UniqueName: \"kubernetes.io/projected/33c62270-7ab4-416b-bf5f-e0007f477733-kube-api-access-pp97j\") pod \"nova-api-0\" (UID: \"33c62270-7ab4-416b-bf5f-e0007f477733\") " 
pod="openstack/nova-api-0" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.147833 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.579108 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 21 14:51:28 crc kubenswrapper[4720]: W0121 14:51:28.585297 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33c62270_7ab4_416b_bf5f_e0007f477733.slice/crio-3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9 WatchSource:0}: Error finding container 3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9: Status 404 returned error can't find the container with id 3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9 Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.687444 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103562f8-b254-4684-80a8-5e6ff5160cfd" path="/var/lib/kubelet/pods/103562f8-b254-4684-80a8-5e6ff5160cfd/volumes" Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.766186 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"570c387bcd4121303196e6f33a68b38bb1c295552bf10b14bf44b961167bcd92"} Jan 21 14:51:28 crc kubenswrapper[4720]: I0121 14:51:28.766244 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"3b3464da7ab2557469796af7d2c954c5455a2bd8281b747d9094c865089c84b9"} Jan 21 14:51:29 crc kubenswrapper[4720]: I0121 14:51:29.050375 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 21 14:51:29 crc kubenswrapper[4720]: I0121 14:51:29.776633 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"33c62270-7ab4-416b-bf5f-e0007f477733","Type":"ContainerStarted","Data":"341932c24f87afeb662859ab55a926abbf6d0d46a11f9f65cfe94db229d33f65"} Jan 21 14:51:30 crc kubenswrapper[4720]: I0121 14:51:30.104514 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:51:30 crc kubenswrapper[4720]: I0121 14:51:30.104573 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.050495 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.076647 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.094101 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=7.094078334 podStartE2EDuration="7.094078334s" podCreationTimestamp="2026-01-21 14:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:51:29.796581483 +0000 UTC m=+1327.705321435" watchObservedRunningTime="2026-01-21 14:51:34.094078334 +0000 UTC m=+1332.002818286" Jan 21 14:51:34 crc kubenswrapper[4720]: I0121 14:51:34.861437 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Jan 21 14:51:35 crc kubenswrapper[4720]: I0121 14:51:35.105129 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:51:35 crc kubenswrapper[4720]: I0121 14:51:35.105188 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 21 14:51:36 crc kubenswrapper[4720]: I0121 14:51:36.120863 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7177980c-4db3-4902-aac2-c0825b778b2a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:36 crc kubenswrapper[4720]: I0121 14:51:36.121192 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7177980c-4db3-4902-aac2-c0825b778b2a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:37 crc kubenswrapper[4720]: I0121 14:51:37.892133 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 21 14:51:38 crc kubenswrapper[4720]: I0121 14:51:38.148676 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:38 crc kubenswrapper[4720]: I0121 14:51:38.148849 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 21 14:51:39 crc kubenswrapper[4720]: I0121 14:51:39.156884 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33c62270-7ab4-416b-bf5f-e0007f477733" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:39 crc kubenswrapper[4720]: I0121 14:51:39.163908 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="33c62270-7ab4-416b-bf5f-e0007f477733" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.190:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.110076 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.113044 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.116857 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:51:45 crc kubenswrapper[4720]: I0121 14:51:45.947425 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.155985 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.157078 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.158041 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: 
I0121 14:51:48.179613 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.957429 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 21 14:51:48 crc kubenswrapper[4720]: I0121 14:51:48.966055 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 21 14:51:58 crc kubenswrapper[4720]: I0121 14:51:58.046672 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:51:59 crc kubenswrapper[4720]: I0121 14:51:59.751045 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:02 crc kubenswrapper[4720]: I0121 14:52:02.672486 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" containerID="cri-o://41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" gracePeriod=604796 Jan 21 14:52:03 crc kubenswrapper[4720]: I0121 14:52:03.726146 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" containerID="cri-o://9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" gracePeriod=604797 Jan 21 14:52:07 crc kubenswrapper[4720]: I0121 14:52:07.444927 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Jan 21 14:52:07 crc kubenswrapper[4720]: I0121 14:52:07.869432 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.116946 4720 generic.go:334] "Generic (PLEG): container finished" podID="3a2eafda-c352-4311-94d5-a1aec1422699" containerID="41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" exitCode=0 Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.117552 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c"} Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.220106 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352215 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352272 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352299 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352390 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352436 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352486 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352614 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352641 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352690 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: 
\"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.352716 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") pod \"3a2eafda-c352-4311-94d5-a1aec1422699\" (UID: \"3a2eafda-c352-4311-94d5-a1aec1422699\") " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.353978 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.357067 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.357436 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360059 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360376 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info" (OuterVolumeSpecName: "pod-info") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.360818 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.374769 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.374928 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj" (OuterVolumeSpecName: "kube-api-access-lndbj") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "kube-api-access-lndbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.391576 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data" (OuterVolumeSpecName: "config-data") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455020 4720 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3a2eafda-c352-4311-94d5-a1aec1422699-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455296 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455418 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455545 4720 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455618 4720 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455695 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455752 4720 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3a2eafda-c352-4311-94d5-a1aec1422699-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455815 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.455950 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndbj\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-kube-api-access-lndbj\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.462184 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf" (OuterVolumeSpecName: "server-conf") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.490506 4720 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.499693 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3a2eafda-c352-4311-94d5-a1aec1422699" (UID: "3a2eafda-c352-4311-94d5-a1aec1422699"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558156 4720 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558187 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3a2eafda-c352-4311-94d5-a1aec1422699-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:09 crc kubenswrapper[4720]: I0121 14:52:09.558200 4720 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3a2eafda-c352-4311-94d5-a1aec1422699-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.135937 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerID="9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" exitCode=0 Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.136196 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e"} Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149453 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3a2eafda-c352-4311-94d5-a1aec1422699","Type":"ContainerDied","Data":"da6b6b430f12d2b56cf212530b8e484bf3b8d0da1c76e1f2c9cac8d57f6efdf2"} Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149487 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.149509 4720 scope.go:117] "RemoveContainer" containerID="41acd62d6994c3b333557260be3b41ae84ff11452b3f18db90c86f45eaee7f6c" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.182597 4720 scope.go:117] "RemoveContainer" containerID="c4453d3c9ef59902e453daa4adc4cd400e16b0fd0ef2955bff89215fad4b9aed" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.201832 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.209092 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.231841 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: E0121 14:52:10.232279 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="setup-container" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232301 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="setup-container" Jan 21 14:52:10 crc kubenswrapper[4720]: E0121 14:52:10.232317 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232326 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.232548 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" containerName="rabbitmq" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.233712 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.242405 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.243761 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.243990 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.244264 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qrxkj" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.244400 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.246021 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.253151 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.269949 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.352084 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.396782 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397133 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397372 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397687 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397832 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.397953 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.398100 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc 
kubenswrapper[4720]: I0121 14:52:10.398257 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.398368 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499404 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499452 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499542 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499574 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499784 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499824 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499847 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") pod 
\"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499930 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499950 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.499973 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") pod \"c1752995-abec-46de-adf8-da9e3ed99d4a\" (UID: \"c1752995-abec-46de-adf8-da9e3ed99d4a\") " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500171 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500230 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500252 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500268 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500290 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500326 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500351 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") 
pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500384 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500408 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500447 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500480 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.500956 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.501503 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.502294 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-config-data\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.508851 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.509057 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.509338 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.510340 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db" (OuterVolumeSpecName: "kube-api-access-4f5db") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "kube-api-access-4f5db". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.510964 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.513256 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.513584 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.514518 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.515141 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info" (OuterVolumeSpecName: "pod-info") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.516523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.517833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.526994 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.529247 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.535489 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.538709 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.550240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfpn9\" (UniqueName: \"kubernetes.io/projected/f73dd82b-9ad1-4deb-b244-6d42a3f25f89-kube-api-access-sfpn9\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.560144 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data" (OuterVolumeSpecName: "config-data") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.561912 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"f73dd82b-9ad1-4deb-b244-6d42a3f25f89\") " pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.602570 4720 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1752995-abec-46de-adf8-da9e3ed99d4a-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.602809 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603174 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f5db\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-kube-api-access-4f5db\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603548 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603614 4720 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1752995-abec-46de-adf8-da9e3ed99d4a-pod-info\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603712 4720 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603793 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603855 4720 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.603912 4720 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.610823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf" (OuterVolumeSpecName: "server-conf") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.623590 4720 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.641914 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.667479 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c1752995-abec-46de-adf8-da9e3ed99d4a" (UID: "c1752995-abec-46de-adf8-da9e3ed99d4a"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.689942 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a2eafda-c352-4311-94d5-a1aec1422699" path="/var/lib/kubelet/pods/3a2eafda-c352-4311-94d5-a1aec1422699/volumes" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705046 4720 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1752995-abec-46de-adf8-da9e3ed99d4a-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705074 4720 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:10 crc kubenswrapper[4720]: I0121 14:52:10.705083 4720 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1752995-abec-46de-adf8-da9e3ed99d4a-server-conf\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.090987 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168816 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c1752995-abec-46de-adf8-da9e3ed99d4a","Type":"ContainerDied","Data":"348934cdbf75477f1ab960f3f1053dff6dbf9d2daa8c4387234ea6851e521a6d"} Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168870 4720 scope.go:117] "RemoveContainer" containerID="9c861cf27787d0df1915de176ea7b338ba9e65e509d7002abe91b7eb691fa61e" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.168867 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.171127 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"ae02bad546a79ede0b3c6acafdd6a20aac4e570c51c42547e3c791db88948b01"} Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.236792 4720 scope.go:117] "RemoveContainer" containerID="c805233f5325caf425e355c639bbb38416823bf3012c2a9fbf778f7b0bf437ea" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.263969 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.287133 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.333638 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: E0121 14:52:11.333970 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.333984 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: E0121 14:52:11.334012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="setup-container" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.334019 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="setup-container" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.334167 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" containerName="rabbitmq" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.335022 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.337960 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338120 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338279 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.338420 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.340994 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.340993 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d7vj6" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.344441 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.360899 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417901 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417957 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.417987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418074 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418097 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418116 4720 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418148 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418288 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418366 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418447 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.418498 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520854 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520922 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520943 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520969 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.520995 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521018 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521045 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521069 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521111 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521145 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521422 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.521930 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.522737 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.523565 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4906b5ed-c663-4e81-ab33-2b8f33777cd1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.523969 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.524063 4720 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.526255 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4906b5ed-c663-4e81-ab33-2b8f33777cd1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.527582 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.527784 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.531980 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4906b5ed-c663-4e81-ab33-2b8f33777cd1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.542976 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g48tj\" (UniqueName: \"kubernetes.io/projected/4906b5ed-c663-4e81-ab33-2b8f33777cd1-kube-api-access-g48tj\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.553756 4720 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4906b5ed-c663-4e81-ab33-2b8f33777cd1\") " pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:11 crc kubenswrapper[4720]: I0121 14:52:11.653395 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.169734 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.186180 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"3b8bf9c12b304da22eda84d8d4aef9c2b44b6468916c9d1a4ccc32de0a4b3200"} Jan 21 14:52:12 crc kubenswrapper[4720]: I0121 14:52:12.687847 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1752995-abec-46de-adf8-da9e3ed99d4a" path="/var/lib/kubelet/pods/c1752995-abec-46de-adf8-da9e3ed99d4a/volumes" Jan 21 14:52:13 crc kubenswrapper[4720]: I0121 14:52:13.200385 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba"} Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.211504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e"} Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.923222 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.924853 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.926775 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 21 14:52:14 crc kubenswrapper[4720]: I0121 14:52:14.945964 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001300 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001352 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001411 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001437 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001495 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.001596 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102881 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102931 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: 
\"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102954 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.102996 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103020 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103071 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.103840 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.104366 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.105282 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.105819 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.106307 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: 
I0121 14:52:15.125284 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"dnsmasq-dns-6447ccbd8f-2dxkf\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.244373 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:15 crc kubenswrapper[4720]: I0121 14:52:15.692419 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226612 4720 generic.go:334] "Generic (PLEG): container finished" podID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerID="6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8" exitCode=0 Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226672 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8"} Jan 21 14:52:16 crc kubenswrapper[4720]: I0121 14:52:16.226959 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerStarted","Data":"116479b2273f15b257f5cd3bbc45cad56003eb6d06ec69e9ecf37fc87ad84fef"} Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.236250 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerStarted","Data":"f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650"} Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.237689 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:17 crc kubenswrapper[4720]: I0121 14:52:17.263273 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" podStartSLOduration=3.263254895 podStartE2EDuration="3.263254895s" podCreationTimestamp="2026-01-21 14:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:17.25822392 +0000 UTC m=+1375.166963852" watchObservedRunningTime="2026-01-21 14:52:17.263254895 +0000 UTC m=+1375.171994827" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.245945 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.354728 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.355075 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" containerID="cri-o://1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" gracePeriod=10 Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.476392 4720 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" 
podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.183:5353: connect: connection refused" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.565052 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.572798 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.580093 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722519 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722631 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722693 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722731 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722764 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.722820 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824817 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824905 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.824990 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825200 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825314 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.825400 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.826393 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-sb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.826699 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827099 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-config\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827574 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-ovsdbserver-nb\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.827773 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/248ea464-73a3-4083-bb27-fc2cb7347224-dns-svc\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.847301 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrvp\" (UniqueName: \"kubernetes.io/projected/248ea464-73a3-4083-bb27-fc2cb7347224-kube-api-access-xfrvp\") pod \"dnsmasq-dns-6c5d8cf46f-bgxfr\" (UID: \"248ea464-73a3-4083-bb27-fc2cb7347224\") " pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.908969 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:25 crc kubenswrapper[4720]: I0121 14:52:25.925065 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033135 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033423 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033445 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033530 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.033552 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") pod \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\" (UID: \"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6\") " Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.042151 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2" (OuterVolumeSpecName: "kube-api-access-wlfx2") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "kube-api-access-wlfx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.084648 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config" (OuterVolumeSpecName: "config") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.101952 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.115467 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.129735 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" (UID: "75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136833 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlfx2\" (UniqueName: \"kubernetes.io/projected/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-kube-api-access-wlfx2\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136873 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136884 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136897 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.136908 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325105 4720 generic.go:334] "Generic (PLEG): container finished" podID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" exitCode=0 Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325160 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325190 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" 
event={"ID":"75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6","Type":"ContainerDied","Data":"62714c9a0f1a3b425c21ab81569bf9c4c0ba1448aea15537467fba81fe36bdf5"} Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325210 4720 scope.go:117] "RemoveContainer" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.325381 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-9p6zm" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.363248 4720 scope.go:117] "RemoveContainer" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.376884 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.391353 4720 scope.go:117] "RemoveContainer" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.392826 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": container with ID starting with 1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203 not found: ID does not exist" containerID="1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.392881 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203"} err="failed to get container status \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": rpc error: code = NotFound desc = could not find container \"1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203\": container with ID starting with 1715d91e262791470c33fe547bc682331a4067f0c9ab327b2b7d0a3b75411203 not found: ID does not exist" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.392915 4720 scope.go:117] "RemoveContainer" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.393832 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": container with ID starting with 8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61 not found: ID does not exist" containerID="8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.393867 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61"} err="failed to get container status \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": rpc error: code = NotFound desc = could not find container \"8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61\": container with ID starting with 8d26a926c7513c277eda02616206b8d63ce1e5af78c608a35fda9052efcedc61 not found: ID does not exist" Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.395577 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-9p6zm"] Jan 21 14:52:26 crc kubenswrapper[4720]: W0121 
14:52:26.400417 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248ea464_73a3_4083_bb27_fc2cb7347224.slice/crio-98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76 WatchSource:0}: Error finding container 98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76: Status 404 returned error can't find the container with id 98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76 Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.403645 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c5d8cf46f-bgxfr"] Jan 21 14:52:26 crc kubenswrapper[4720]: I0121 14:52:26.689227 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" path="/var/lib/kubelet/pods/75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6/volumes" Jan 21 14:52:26 crc kubenswrapper[4720]: E0121 14:52:26.894130 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248ea464_73a3_4083_bb27_fc2cb7347224.slice/crio-a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85.scope\": RecentStats: unable to find data in memory cache]" Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337374 4720 generic.go:334] "Generic (PLEG): container finished" podID="248ea464-73a3-4083-bb27-fc2cb7347224" containerID="a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85" exitCode=0 Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337423 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerDied","Data":"a1cd28face0b16a480d00b0be1619baabea48ad080302e25648ffac49d0afe85"} Jan 21 14:52:27 crc kubenswrapper[4720]: I0121 14:52:27.337450 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerStarted","Data":"98161d0e74643b7c2623919d017c9064aa341518c103b5a43586aecba7de3d76"} Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.348199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" event={"ID":"248ea464-73a3-4083-bb27-fc2cb7347224","Type":"ContainerStarted","Data":"bcf89378c96db8e2b2f7b382806c80c79e5afed950d542ae9620d8e1972262de"} Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.349560 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:28 crc kubenswrapper[4720]: I0121 14:52:28.372603 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" podStartSLOduration=3.3725707209999998 podStartE2EDuration="3.372570721s" podCreationTimestamp="2026-01-21 14:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:28.364515857 +0000 UTC m=+1386.273255809" watchObservedRunningTime="2026-01-21 14:52:28.372570721 +0000 UTC m=+1386.281310653" Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.910954 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c5d8cf46f-bgxfr" Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.995963 4720 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:35 crc kubenswrapper[4720]: I0121 14:52:35.996226 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" containerID="cri-o://f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" gracePeriod=10 Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251374 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:36 crc kubenswrapper[4720]: E0121 14:52:36.251810 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="init" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251825 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="init" Jan 21 14:52:36 crc kubenswrapper[4720]: E0121 14:52:36.251841 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.251849 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.252099 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="75b9e05a-57c8-48d4-9baa-c8b50fd2e9e6" containerName="dnsmasq-dns" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.253561 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.279537 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.321869 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.321910 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.322037 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.416363 4720 generic.go:334] "Generic (PLEG): container finished" podID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerID="f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" exitCode=0 Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.416411 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650"} Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.423925 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424051 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424076 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424623 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.424745 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.443698 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"community-operators-5cdm8\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:36 crc kubenswrapper[4720]: I0121 14:52:36.593496 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:37 crc kubenswrapper[4720]: W0121 14:52:37.351862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35639e0c_f3bb_48c7_9879_442aff2fcdbc.slice/crio-e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e WatchSource:0}: Error finding container e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e: Status 404 returned error can't find the container with id e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.356044 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.430882 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e"} Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.712171 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849578 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849672 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.849789 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.850416 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.850904 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") pod \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\" (UID: \"532b4122-14d5-4dd2-84e5-f08c72a5c34e\") " Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.903861 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth" (OuterVolumeSpecName: "kube-api-access-ptkth") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "kube-api-access-ptkth". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.953205 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptkth\" (UniqueName: \"kubernetes.io/projected/532b4122-14d5-4dd2-84e5-f08c72a5c34e-kube-api-access-ptkth\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.969368 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990537 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990737 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:37 crc kubenswrapper[4720]: I0121 14:52:37.990975 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config" (OuterVolumeSpecName: "config") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.004823 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "532b4122-14d5-4dd2-84e5-f08c72a5c34e" (UID: "532b4122-14d5-4dd2-84e5-f08c72a5c34e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054608 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054689 4720 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054699 4720 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054709 4720 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.054716 4720 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/532b4122-14d5-4dd2-84e5-f08c72a5c34e-config\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.460380 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e" exitCode=0 Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.460505 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e"} Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465792 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" event={"ID":"532b4122-14d5-4dd2-84e5-f08c72a5c34e","Type":"ContainerDied","Data":"116479b2273f15b257f5cd3bbc45cad56003eb6d06ec69e9ecf37fc87ad84fef"} Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465843 4720 scope.go:117] "RemoveContainer" containerID="f9191da53845c4fb5f27b24bb54bbfc18c78bad73b8451c22c1b2afb7d84f650" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.465857 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-2dxkf" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.501629 4720 scope.go:117] "RemoveContainer" containerID="6b2d92d59d9fbd6ef4876eb39419107419a776f3afb8f7e157056e0cac869cb8" Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.508879 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.516514 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-2dxkf"] Jan 21 14:52:38 crc kubenswrapper[4720]: I0121 14:52:38.691936 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" path="/var/lib/kubelet/pods/532b4122-14d5-4dd2-84e5-f08c72a5c34e/volumes" Jan 21 14:52:40 crc kubenswrapper[4720]: I0121 14:52:40.486064 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee"} Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.499157 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee" exitCode=0 Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.499233 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee"} Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.888699 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:41 crc kubenswrapper[4720]: E0121 14:52:41.889116 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="init" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889134 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="init" Jan 21 14:52:41 crc kubenswrapper[4720]: E0121 14:52:41.889155 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889164 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889345 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="532b4122-14d5-4dd2-84e5-f08c72a5c34e" containerName="dnsmasq-dns" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.889891 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.894941 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895393 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895685 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.895975 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.919277 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.928621 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.928924 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.929073 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:41 crc kubenswrapper[4720]: I0121 14:52:41.929280 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.031580 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.031975 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: 
\"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.032001 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.032109 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037326 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037463 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.037715 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.058505 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.213267 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.510217 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerStarted","Data":"b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007"} Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.536743 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5cdm8" podStartSLOduration=2.95468241 podStartE2EDuration="6.536721704s" podCreationTimestamp="2026-01-21 14:52:36 +0000 UTC" firstStartedPulling="2026-01-21 14:52:38.462279323 +0000 UTC m=+1396.371019255" lastFinishedPulling="2026-01-21 14:52:42.044318617 +0000 UTC m=+1399.953058549" observedRunningTime="2026-01-21 14:52:42.527043913 +0000 UTC m=+1400.435783865" watchObservedRunningTime="2026-01-21 14:52:42.536721704 +0000 UTC m=+1400.445461636" Jan 21 14:52:42 crc kubenswrapper[4720]: W0121 14:52:42.742061 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0506243d_6216_4541_8f14_8b2c2beb409b.slice/crio-de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc WatchSource:0}: Error finding container de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc: Status 404 returned error can't find the container with id de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc Jan 21 14:52:42 crc kubenswrapper[4720]: I0121 14:52:42.753587 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt"] Jan 21 14:52:43 crc kubenswrapper[4720]: I0121 14:52:43.519568 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerStarted","Data":"de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc"} Jan 21 14:52:45 crc kubenswrapper[4720]: I0121 14:52:45.542826 4720 generic.go:334] "Generic (PLEG): container finished" podID="f73dd82b-9ad1-4deb-b244-6d42a3f25f89" containerID="79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba" exitCode=0 Jan 21 14:52:45 crc kubenswrapper[4720]: I0121 14:52:45.542966 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerDied","Data":"79b3aabb4928f6b631f8bd790a70c6b51e6a763cb8fc8dcce474163ba33400ba"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.553515 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f73dd82b-9ad1-4deb-b244-6d42a3f25f89","Type":"ContainerStarted","Data":"ab540f557245b3bfdd0325f392688f0c5633f332b617739e086b14de929c49c9"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554036 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554830 4720 generic.go:334] "Generic (PLEG): container finished" podID="4906b5ed-c663-4e81-ab33-2b8f33777cd1" containerID="18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e" exitCode=0 Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.554874 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerDied","Data":"18cfbfd34acd13af24c666d9d7a73718d1d3050e2f8b8e7529a2094d7947823e"} Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.590369 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.590347733 podStartE2EDuration="36.590347733s" podCreationTimestamp="2026-01-21 14:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:46.576725508 +0000 UTC m=+1404.485465450" watchObservedRunningTime="2026-01-21 14:52:46.590347733 +0000 UTC m=+1404.499087675" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.594325 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.594387 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:46 crc kubenswrapper[4720]: I0121 14:52:46.645977 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:47 crc kubenswrapper[4720]: I0121 14:52:47.617094 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:47 crc kubenswrapper[4720]: I0121 14:52:47.670153 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:49 crc kubenswrapper[4720]: I0121 14:52:49.580746 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5cdm8" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" containerID="cri-o://b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" gracePeriod=2 Jan 21 14:52:50 crc kubenswrapper[4720]: I0121 14:52:50.595484 4720 generic.go:334] "Generic (PLEG): container finished" podID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerID="b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" exitCode=0 Jan 21 14:52:50 crc kubenswrapper[4720]: I0121 14:52:50.595540 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007"} Jan 21 14:52:52 crc kubenswrapper[4720]: I0121 14:52:52.881786 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:52:52 crc kubenswrapper[4720]: I0121 14:52:52.882215 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.088095 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185669 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185751 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.185905 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") pod \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\" (UID: \"35639e0c-f3bb-48c7-9879-442aff2fcdbc\") " Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.186809 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities" (OuterVolumeSpecName: "utilities") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.189455 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6" (OuterVolumeSpecName: "kube-api-access-g8hh6") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "kube-api-access-g8hh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.235921 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35639e0c-f3bb-48c7-9879-442aff2fcdbc" (UID: "35639e0c-f3bb-48c7-9879-442aff2fcdbc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287759 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8hh6\" (UniqueName: \"kubernetes.io/projected/35639e0c-f3bb-48c7-9879-442aff2fcdbc-kube-api-access-g8hh6\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287797 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.287809 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35639e0c-f3bb-48c7-9879-442aff2fcdbc-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.632949 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerStarted","Data":"5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.634876 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4906b5ed-c663-4e81-ab33-2b8f33777cd1","Type":"ContainerStarted","Data":"6306cc8c931fa6efdfe4271cdd0ca55e0e1479ce75a7982c399644765f122a3b"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.635071 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636738 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5cdm8" event={"ID":"35639e0c-f3bb-48c7-9879-442aff2fcdbc","Type":"ContainerDied","Data":"e36a7ccb75bfaef518302425aaa5615316e673c41c0f4ff8d85f9079db99588e"} Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636768 4720 scope.go:117] "RemoveContainer" containerID="b7a20252450813cdcecb54726afdb172a34e5c4adc5f07b8452b3e8875d4e007" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.636809 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5cdm8" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.656012 4720 scope.go:117] "RemoveContainer" containerID="eb22a6e0337b0f3e36f438e6a72843a04dd1a8862e6d1d19df010bf8b20418ee" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.658298 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" podStartSLOduration=2.500018223 podStartE2EDuration="12.658286675s" podCreationTimestamp="2026-01-21 14:52:41 +0000 UTC" firstStartedPulling="2026-01-21 14:52:42.745551778 +0000 UTC m=+1400.654291700" lastFinishedPulling="2026-01-21 14:52:52.90382022 +0000 UTC m=+1410.812560152" observedRunningTime="2026-01-21 14:52:53.649928302 +0000 UTC m=+1411.558668254" watchObservedRunningTime="2026-01-21 14:52:53.658286675 +0000 UTC m=+1411.567026607" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.682925 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.682910028 podStartE2EDuration="42.682910028s" podCreationTimestamp="2026-01-21 14:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 14:52:53.676641597 +0000 UTC m=+1411.585381529" watchObservedRunningTime="2026-01-21 14:52:53.682910028 +0000 UTC m=+1411.591649960" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.703837 4720 scope.go:117] "RemoveContainer" containerID="514634cf38bfc52fe83d46810491a641041e45c3b3ded3080688a74aa08d5e9e" Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.711625 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:53 crc kubenswrapper[4720]: I0121 14:52:53.727484 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5cdm8"] Jan 21 14:52:54 crc kubenswrapper[4720]: I0121 14:52:54.688784 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" path="/var/lib/kubelet/pods/35639e0c-f3bb-48c7-9879-442aff2fcdbc/volumes" Jan 21 14:53:00 crc kubenswrapper[4720]: I0121 14:53:00.646837 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 21 14:53:05 crc kubenswrapper[4720]: I0121 14:53:05.750319 4720 generic.go:334] "Generic (PLEG): container finished" podID="0506243d-6216-4541-8f14-8b2c2beb409b" containerID="5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4" exitCode=0 Jan 21 14:53:05 crc kubenswrapper[4720]: I0121 14:53:05.750446 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerDied","Data":"5133cb32808a6268b3ae340020a5ce8d3435cdf1af5aee0578ce67779a55b8e4"} Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.234904 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360388 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360773 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.360928 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.361101 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") pod \"0506243d-6216-4541-8f14-8b2c2beb409b\" (UID: \"0506243d-6216-4541-8f14-8b2c2beb409b\") " Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.366015 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4" (OuterVolumeSpecName: "kube-api-access-z4ff4") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "kube-api-access-z4ff4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.366385 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.387960 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory" (OuterVolumeSpecName: "inventory") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.403483 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0506243d-6216-4541-8f14-8b2c2beb409b" (UID: "0506243d-6216-4541-8f14-8b2c2beb409b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.462989 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463025 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4ff4\" (UniqueName: \"kubernetes.io/projected/0506243d-6216-4541-8f14-8b2c2beb409b-kube-api-access-z4ff4\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463040 4720 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.463052 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0506243d-6216-4541-8f14-8b2c2beb409b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771066 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" event={"ID":"0506243d-6216-4541-8f14-8b2c2beb409b","Type":"ContainerDied","Data":"de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc"} Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771310 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de902a439b17ff2cb12404864917c3124a0c9d6ef8eb23dc1bc69b8a1f6ee5bc" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.771412 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.891255 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0506243d_6216_4541_8f14_8b2c2beb409b.slice\": RecentStats: unable to find data in memory cache]" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.897401 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898092 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898115 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898145 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-utilities" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898154 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-utilities" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898185 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898193 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: E0121 14:53:07.898231 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-content" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898240 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="extract-content" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898543 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="35639e0c-f3bb-48c7-9879-442aff2fcdbc" containerName="registry-server" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.898574 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="0506243d-6216-4541-8f14-8b2c2beb409b" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.899596 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.909336 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.909867 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.910114 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.910439 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:53:07 crc kubenswrapper[4720]: I0121 14:53:07.934606 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075561 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075716 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075763 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.075807 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177031 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177278 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177322 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.177366 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.192079 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.192428 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.193309 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.194811 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.259809 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:53:08 crc kubenswrapper[4720]: I0121 14:53:08.832927 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6"] Jan 21 14:53:08 crc kubenswrapper[4720]: W0121 14:53:08.837862 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb96fb314_d163_41a0_b2b0_9a9c117d504c.slice/crio-6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9 WatchSource:0}: Error finding container 6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9: Status 404 returned error can't find the container with id 6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9 Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.790487 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerStarted","Data":"66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5"} Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.790998 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerStarted","Data":"6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9"} Jan 21 14:53:09 crc kubenswrapper[4720]: I0121 14:53:09.815668 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" podStartSLOduration=2.388559455 podStartE2EDuration="2.815634737s" podCreationTimestamp="2026-01-21 14:53:07 +0000 UTC" firstStartedPulling="2026-01-21 14:53:08.840965977 +0000 UTC m=+1426.749705919" lastFinishedPulling="2026-01-21 14:53:09.268041269 +0000 UTC m=+1427.176781201" observedRunningTime="2026-01-21 14:53:09.815528424 +0000 UTC m=+1427.724268376" watchObservedRunningTime="2026-01-21 14:53:09.815634737 +0000 UTC m=+1427.724374669" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.238821 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.240799 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.253487 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334177 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334478 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.334526 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436608 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436833 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.436874 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.437260 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.437316 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.460143 4720 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"certified-operators-6vs9g\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.567915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:11 crc kubenswrapper[4720]: I0121 14:53:11.666075 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.085421 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:12 crc kubenswrapper[4720]: W0121 14:53:12.086039 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3eca900_8aa2_4835_9864_c67e98b7172e.slice/crio-6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334 WatchSource:0}: Error finding container 6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334: Status 404 returned error can't find the container with id 6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334 Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.828850 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" exitCode=0 Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.829181 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1"} Jan 21 14:53:12 crc kubenswrapper[4720]: I0121 14:53:12.829215 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334"} Jan 21 14:53:13 crc kubenswrapper[4720]: I0121 14:53:13.843902 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} Jan 21 14:53:15 crc kubenswrapper[4720]: I0121 14:53:15.886031 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" exitCode=0 Jan 21 14:53:15 crc kubenswrapper[4720]: I0121 14:53:15.886830 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} Jan 21 14:53:16 crc kubenswrapper[4720]: I0121 14:53:16.898432 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerStarted","Data":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} Jan 21 14:53:16 crc kubenswrapper[4720]: 
Jan 21 14:53:16 crc kubenswrapper[4720]: I0121 14:53:16.927353 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vs9g" podStartSLOduration=2.374680274 podStartE2EDuration="5.927334207s" podCreationTimestamp="2026-01-21 14:53:11 +0000 UTC" firstStartedPulling="2026-01-21 14:53:12.830881527 +0000 UTC m=+1430.739621459" lastFinishedPulling="2026-01-21 14:53:16.38353542 +0000 UTC m=+1434.292275392" observedRunningTime="2026-01-21 14:53:16.918573903 +0000 UTC m=+1434.827313845" watchObservedRunningTime="2026-01-21 14:53:16.927334207 +0000 UTC m=+1434.836074149"
Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.568503 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6vs9g"
Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.569363 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vs9g"
Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.619260 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6vs9g"
Jan 21 14:53:21 crc kubenswrapper[4720]: I0121 14:53:21.978733 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vs9g"
Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.029494 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"]
Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.880023 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:53:22 crc kubenswrapper[4720]: I0121 14:53:22.880115 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:53:23 crc kubenswrapper[4720]: I0121 14:53:23.952996 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vs9g" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" containerID="cri-o://7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" gracePeriod=2
Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.431564 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g"
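The machine-config-daemon failures above are a plain HTTP liveness probe against http://127.0.0.1:8798/health; "connection refused" means nothing is listening on that port, and at 14:53:52 below the same failure crosses the threshold and the kubelet kills the container for restart. A stdlib Go sketch of the check the prober is effectively running; only the URL comes from the log, the one-second timeout and the 2xx/3xx success rule are the kubelet's usual HTTP-probe convention, assumed here:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHealth mimics an httpGet liveness probe: any 2xx/3xx status counts as
// success; a refused connection or other status counts as failure.
func probeHealth(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	fmt.Println(probeHealth("http://127.0.0.1:8798/health", time.Second))
}
```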
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467258 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467328 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.467459 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") pod \"d3eca900-8aa2-4835-9864-c67e98b7172e\" (UID: \"d3eca900-8aa2-4835-9864-c67e98b7172e\") " Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.468512 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities" (OuterVolumeSpecName: "utilities") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.474376 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d" (OuterVolumeSpecName: "kube-api-access-7qn2d") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "kube-api-access-7qn2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.517484 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d3eca900-8aa2-4835-9864-c67e98b7172e" (UID: "d3eca900-8aa2-4835-9864-c67e98b7172e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569822 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569866 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qn2d\" (UniqueName: \"kubernetes.io/projected/d3eca900-8aa2-4835-9864-c67e98b7172e-kube-api-access-7qn2d\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.569880 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d3eca900-8aa2-4835-9864-c67e98b7172e-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964497 4720 generic.go:334] "Generic (PLEG): container finished" podID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" exitCode=0 Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964536 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964559 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vs9g" event={"ID":"d3eca900-8aa2-4835-9864-c67e98b7172e","Type":"ContainerDied","Data":"6ef6f8a8745a1fd25538e6a623a0173f200dba606dd9336f3b9cba1094620334"} Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964569 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vs9g" Jan 21 14:53:24 crc kubenswrapper[4720]: I0121 14:53:24.964577 4720 scope.go:117] "RemoveContainer" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.000440 4720 scope.go:117] "RemoveContainer" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.003755 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.013232 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vs9g"] Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.031137 4720 scope.go:117] "RemoveContainer" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070289 4720 scope.go:117] "RemoveContainer" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.070673 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": container with ID starting with 7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e not found: ID does not exist" containerID="7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070699 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e"} err="failed to get container status \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": rpc error: code = NotFound desc = could not find container \"7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e\": container with ID starting with 7f2326f414fdee3909bf2ae27cfa083a5fb3ce13aa7dafad0380c600b1dbab9e not found: ID does not exist" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.070719 4720 scope.go:117] "RemoveContainer" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.071222 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": container with ID starting with 8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b not found: ID does not exist" containerID="8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071238 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b"} err="failed to get container status \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": rpc error: code = NotFound desc = could not find container \"8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b\": container with ID starting with 8bc1514448df4c980dea663a33ee6c0eab594d105324df98f1363d892be25a1b not found: ID does not exist" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071253 4720 scope.go:117] "RemoveContainer" 
containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: E0121 14:53:25.071471 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": container with ID starting with fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1 not found: ID does not exist" containerID="fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1" Jan 21 14:53:25 crc kubenswrapper[4720]: I0121 14:53:25.071487 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1"} err="failed to get container status \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": rpc error: code = NotFound desc = could not find container \"fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1\": container with ID starting with fb9dab8606bf46f6282ba5f500ddd93bc21258fdf3daaffde6a10aad43740dd1 not found: ID does not exist" Jan 21 14:53:26 crc kubenswrapper[4720]: I0121 14:53:26.690382 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" path="/var/lib/kubelet/pods/d3eca900-8aa2-4835-9864-c67e98b7172e/volumes" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.879616 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.880429 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.880493 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.881694 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 21 14:53:52 crc kubenswrapper[4720]: I0121 14:53:52.881807 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f" gracePeriod=600 Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214467 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f" exitCode=0 Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214551 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"} Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"} Jan 21 14:53:53 crc kubenswrapper[4720]: I0121 14:53:53.214786 4720 scope.go:117] "RemoveContainer" containerID="c955510d9d72215d99901afe6e11ff00ee6cb8f0d5290256bae37e29e3631aa6" Jan 21 14:53:57 crc kubenswrapper[4720]: I0121 14:53:57.967798 4720 scope.go:117] "RemoveContainer" containerID="cc3e9052ef84997a09ae1c29fb5eed4fd4dc22153bc67325317d7b50498a93b9" Jan 21 14:53:57 crc kubenswrapper[4720]: I0121 14:53:57.995872 4720 scope.go:117] "RemoveContainer" containerID="bc4beac3df68c3a4d150eba1728e09c2fdcdca24969df6e9d7185b1713f0ae4f" Jan 21 14:53:58 crc kubenswrapper[4720]: I0121 14:53:58.061380 4720 scope.go:117] "RemoveContainer" containerID="7fda82afe9e25635d25bfab63eae235397df92725d98016475c28391c7bd5687" Jan 21 14:54:58 crc kubenswrapper[4720]: I0121 14:54:58.191572 4720 scope.go:117] "RemoveContainer" containerID="59fd91b37bfcd11f4ff497c598ac3f209fb0f59dbb3d22d1cb6e9955f559e0d1" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.254138 4720 scope.go:117] "RemoveContainer" containerID="fe8f0f865bcfbac8500256bd0011d0f3321a6c7dc7b1a223783f54471eacf3d7" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.280059 4720 scope.go:117] "RemoveContainer" containerID="4a49a0860cc34aea77152e63d7f2664cc0101f7fb517d16de1561d7724f281fe" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.299724 4720 scope.go:117] "RemoveContainer" containerID="09f3932991cc54223a102a084d56fb2a6013a3367824ff852a79aaab841c7c9d" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.317629 4720 scope.go:117] "RemoveContainer" containerID="162849d5232970e8a4f401d6bc1bb8b7acd38c2e6e26bea6a8783902f8ef0d61" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.335518 4720 scope.go:117] "RemoveContainer" containerID="2656b4700e21f0c4fe6d2a6022d5d04628debe20176c13e5a7ff671b4ef6cfd2" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.352400 4720 scope.go:117] "RemoveContainer" containerID="80caf00710e62f883afdacc5c1851e84288e4f97b1d4abfb6840c04d5e1f8db2" Jan 21 14:55:58 crc kubenswrapper[4720]: I0121 14:55:58.370252 4720 scope.go:117] "RemoveContainer" containerID="7864be8ab599e4e4751d80908f487622fb05f60c6fa32cf32ec247ac04ec10ee" Jan 21 14:56:22 crc kubenswrapper[4720]: I0121 14:56:22.884234 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:56:22 crc kubenswrapper[4720]: I0121 14:56:22.885155 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:56:39 crc kubenswrapper[4720]: I0121 14:56:39.946351 4720 generic.go:334] "Generic (PLEG): 
container finished" podID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerID="66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5" exitCode=0 Jan 21 14:56:39 crc kubenswrapper[4720]: I0121 14:56:39.946454 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerDied","Data":"66f01845f928a8df606e46c99755ec0f7e0b42c20c551824d6f9b7cd860dc1a5"} Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.337878 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.527904 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528103 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528133 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.528150 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") pod \"b96fb314-d163-41a0-b2b0-9a9c117d504c\" (UID: \"b96fb314-d163-41a0-b2b0-9a9c117d504c\") " Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.659546 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.659647 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn" (OuterVolumeSpecName: "kube-api-access-qtctn") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "kube-api-access-qtctn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.664186 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.664441 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory" (OuterVolumeSpecName: "inventory") pod "b96fb314-d163-41a0-b2b0-9a9c117d504c" (UID: "b96fb314-d163-41a0-b2b0-9a9c117d504c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731773 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtctn\" (UniqueName: \"kubernetes.io/projected/b96fb314-d163-41a0-b2b0-9a9c117d504c-kube-api-access-qtctn\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731809 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731819 4720 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.731830 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b96fb314-d163-41a0-b2b0-9a9c117d504c-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.970914 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" event={"ID":"b96fb314-d163-41a0-b2b0-9a9c117d504c","Type":"ContainerDied","Data":"6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9"} Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.970964 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d879920ef670137abce52863a1d8186ab37a8e794f34f027602532736f19ca9" Jan 21 14:56:41 crc kubenswrapper[4720]: I0121 14:56:41.971077 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.088304 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089903 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089923 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089951 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-content" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089958 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-content" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089968 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-utilities" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089974 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="extract-utilities" Jan 21 14:56:42 crc kubenswrapper[4720]: E0121 14:56:42.089994 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.089999 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090187 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3eca900-8aa2-4835-9864-c67e98b7172e" containerName="registry-server" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090203 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="b96fb314-d163-41a0-b2b0-9a9c117d504c" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.090831 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093247 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093487 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093636 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.093798 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.103235 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.239956 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.240096 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.240157 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342037 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.342192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.346803 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.347434 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.369590 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-xdsct\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:42 crc kubenswrapper[4720]: I0121 14:56:42.417564 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.030523 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct"] Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.035895 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.991486 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerStarted","Data":"e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98"} Jan 21 14:56:43 crc kubenswrapper[4720]: I0121 14:56:43.991849 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerStarted","Data":"a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0"} Jan 21 14:56:44 crc kubenswrapper[4720]: I0121 14:56:44.010753 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" podStartSLOduration=1.493850815 podStartE2EDuration="2.010732775s" podCreationTimestamp="2026-01-21 14:56:42 +0000 UTC" firstStartedPulling="2026-01-21 14:56:43.03493696 +0000 UTC m=+1640.943676892" lastFinishedPulling="2026-01-21 14:56:43.55181892 +0000 UTC m=+1641.460558852" observedRunningTime="2026-01-21 14:56:44.010357616 +0000 UTC m=+1641.919097548" watchObservedRunningTime="2026-01-21 14:56:44.010732775 +0000 UTC 
m=+1641.919472727" Jan 21 14:56:52 crc kubenswrapper[4720]: I0121 14:56:52.879527 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 21 14:56:52 crc kubenswrapper[4720]: I0121 14:56:52.880037 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 21 14:56:58 crc kubenswrapper[4720]: I0121 14:56:58.424100 4720 scope.go:117] "RemoveContainer" containerID="0c9026115552582579bf8c91de9fceb499e94a991e4d85938cd66f4935bb22d8" Jan 21 14:56:59 crc kubenswrapper[4720]: I0121 14:56:59.048255 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:56:59 crc kubenswrapper[4720]: I0121 14:56:59.056110 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ckgkh"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.032198 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.045963 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-372a-account-create-update-w4xkf"] Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.691788 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d0385ad-a123-4c46-a96f-652dee1f89cd" path="/var/lib/kubelet/pods/0d0385ad-a123-4c46-a96f-652dee1f89cd/volumes" Jan 21 14:57:00 crc kubenswrapper[4720]: I0121 14:57:00.693470 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bb55ed-9214-4f25-8740-ac50421baa4b" path="/var/lib/kubelet/pods/b4bb55ed-9214-4f25-8740-ac50421baa4b/volumes" Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.039136 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.048729 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mcz8g"] Jan 21 14:57:04 crc kubenswrapper[4720]: I0121 14:57:04.697293 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290dffa3-ed33-4571-aeb1-092aae1d8105" path="/var/lib/kubelet/pods/290dffa3-ed33-4571-aeb1-092aae1d8105/volumes" Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.040723 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.049201 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.059805 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-55njq"] Jan 21 14:57:05 crc kubenswrapper[4720]: I0121 14:57:05.072825 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-318a-account-create-update-lkf6p"] Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.025096 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"] Jan 21 14:57:06 
Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.034165 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-06a3-account-create-update-dbk66"]
Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.688612 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f0b95b-6621-43fe-93c2-d4e7704f1f61" path="/var/lib/kubelet/pods/49f0b95b-6621-43fe-93c2-d4e7704f1f61/volumes"
Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.689768 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8161ded5-d8ab-48b7-9c1a-16a7155641d1" path="/var/lib/kubelet/pods/8161ded5-d8ab-48b7-9c1a-16a7155641d1/volumes"
Jan 21 14:57:06 crc kubenswrapper[4720]: I0121 14:57:06.690512 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fbe0fa-0158-480f-9f6d-2d589da3b91e" path="/var/lib/kubelet/pods/a4fbe0fa-0158-480f-9f6d-2d589da3b91e/volumes"
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.042466 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-89dxv"]
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.051487 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-89dxv"]
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.688818 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc2d647-37b6-4437-98fc-1d95af05cfe0" path="/var/lib/kubelet/pods/1fc2d647-37b6-4437-98fc-1d95af05cfe0/volumes"
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880147 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880207 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.880261 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk"
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.881198 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 14:57:22 crc kubenswrapper[4720]: I0121 14:57:22.881578 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" gracePeriod=600
Jan 21 14:57:23 crc kubenswrapper[4720]: E0121 14:57:23.001166 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996"
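"back-off 5m0s restarting failed container" is the kubelet's restart backoff at its ceiling: the delay starts at 10s and doubles on each failed restart until capped at five minutes, resetting only after the container runs cleanly for a while. A stdlib sketch of that schedule; the 10s base and 5m cap are the kubelet's documented defaults, and the reset rule is omitted here:

```go
package main

import (
	"fmt"
	"time"
)

// restartDelay returns the CrashLoopBackOff delay after n consecutive
// failures: 10s, 20s, 40s, ... capped at 5m (the "back-off 5m0s" in the log).
func restartDelay(n int) time.Duration {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	d := base
	for i := 1; i < n; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 7; n++ {
		fmt.Printf("failure %d -> wait %v\n", n, restartDelay(n))
	}
}
```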
Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316530 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" exitCode=0
Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316579 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"}
Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.316617 4720 scope.go:117] "RemoveContainer" containerID="4a11e96f70bc2887e543718b48f5cffe20ea9e02702421d54bac9042ee7fd65f"
Jan 21 14:57:23 crc kubenswrapper[4720]: I0121 14:57:23.317360 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"
Jan 21 14:57:23 crc kubenswrapper[4720]: E0121 14:57:23.317713 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996"
Jan 21 14:57:36 crc kubenswrapper[4720]: I0121 14:57:36.679022 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"
Jan 21 14:57:36 crc kubenswrapper[4720]: E0121 14:57:36.680359 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996"
Jan 21 14:57:39 crc kubenswrapper[4720]: I0121 14:57:39.035439 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-dtj5w"]
Jan 21 14:57:39 crc kubenswrapper[4720]: I0121 14:57:39.052706 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-dtj5w"]
Jan 21 14:57:40 crc kubenswrapper[4720]: I0121 14:57:40.689344 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c40c650e-a05e-4cc0-88fa-d56eae92d29a" path="/var/lib/kubelet/pods/c40c650e-a05e-4cc0-88fa-d56eae92d29a/volumes"
Jan 21 14:57:43 crc kubenswrapper[4720]: I0121 14:57:43.044672 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-qjpx9"]
Jan 21 14:57:43 crc kubenswrapper[4720]: I0121 14:57:43.061583 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-qjpx9"]
Jan 21 14:57:44 crc kubenswrapper[4720]: I0121 14:57:44.692620 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077e6634-d42f-4765-ab65-9e24cf21a047" path="/var/lib/kubelet/pods/077e6634-d42f-4765-ab65-9e24cf21a047/volumes"
"SyncLoop DELETE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.059333 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.073080 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-9105-account-create-update-h4nvp"] Jan 21 14:57:47 crc kubenswrapper[4720]: I0121 14:57:47.084765 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-43f1-account-create-update-bsqmb"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.035176 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.045204 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.055457 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b1e4-account-create-update-qtmr9"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.062982 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pmrgf"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.068918 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.074717 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-md2wm"] Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.678362 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:57:48 crc kubenswrapper[4720]: E0121 14:57:48.678623 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.693573 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffa29ff-07bd-40cc-9853-a484f79b382f" path="/var/lib/kubelet/pods/5ffa29ff-07bd-40cc-9853-a484f79b382f/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.694336 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6545ddce-5b65-4702-9dee-2f2d9644123e" path="/var/lib/kubelet/pods/6545ddce-5b65-4702-9dee-2f2d9644123e/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.695123 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82f9f1ca-7fe3-4e17-8393-20364149010d" path="/var/lib/kubelet/pods/82f9f1ca-7fe3-4e17-8393-20364149010d/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.695821 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8da5c3a6-e588-412a-b884-7875fe439e61" path="/var/lib/kubelet/pods/8da5c3a6-e588-412a-b884-7875fe439e61/volumes" Jan 21 14:57:48 crc kubenswrapper[4720]: I0121 14:57:48.697304 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a4204b-d91a-4d30-bea2-c327b452b61a" path="/var/lib/kubelet/pods/d3a4204b-d91a-4d30-bea2-c327b452b61a/volumes" Jan 21 14:57:52 crc 
Jan 21 14:57:52 crc kubenswrapper[4720]: I0121 14:57:52.034168 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-f47pm"]
Jan 21 14:57:52 crc kubenswrapper[4720]: I0121 14:57:52.051148 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-f47pm"]
Jan 21 14:57:52 crc kubenswrapper[4720]: I0121 14:57:52.693985 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17" path="/var/lib/kubelet/pods/3ecd5ffd-b54b-4f40-9cef-5ea7aabb6a17/volumes"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.477014 4720 scope.go:117] "RemoveContainer" containerID="41516602ff1ad171062abf2d068bab3f3ef63d954e1d46d8ab67f0a5722b61e9"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.529042 4720 scope.go:117] "RemoveContainer" containerID="7a64ef6d780ce73bbcb9b4e47639e6c2751ab6b42a36ab32810d2bb3c4c85044"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.555340 4720 scope.go:117] "RemoveContainer" containerID="307cb2943833035f93ad418790abe5b99a637ac449640923f1bf4d797ef693c9"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.594386 4720 scope.go:117] "RemoveContainer" containerID="8d5a885edcd4e22f1c2c16df333a61bd50d3383f3347aa464336e86a726533ed"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.642352 4720 scope.go:117] "RemoveContainer" containerID="abbb759ffaf221d0c9f8ed807f7987c4931c0626f086cc661e603dcc248f4947"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.673843 4720 scope.go:117] "RemoveContainer" containerID="3b809ce73b12339e4bd569ef93ddc354e0352255b11435e3d7ab7be025d1d6b0"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.709349 4720 scope.go:117] "RemoveContainer" containerID="77c8d16617de72e209afb71532a20278f4f6ca3c8ddea5a94d98282960f81a1c"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.781933 4720 scope.go:117] "RemoveContainer" containerID="b58fbfdd95d5a162cfec3d9e246f4a009ac8953ff289afaa9f7d6970293702c0"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.803787 4720 scope.go:117] "RemoveContainer" containerID="13d168c727b9d26f6f7317f1e362696e169d6ec9bb3d6175c527decee022cc0f"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.820086 4720 scope.go:117] "RemoveContainer" containerID="93fd560224a5890696cb0b97a0caeb546a3a0f6e334fb8c0f1cfda08ff3cdbe7"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.836586 4720 scope.go:117] "RemoveContainer" containerID="5d1f9a2280c4b827ded3a73860cfbf132b529c55e0547c798c884373c0113797"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.853302 4720 scope.go:117] "RemoveContainer" containerID="bbdc74de2b9aa9d89088725acd4c82b08706e4b50492cfbb262eba1e6a3ade4a"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.874326 4720 scope.go:117] "RemoveContainer" containerID="dae0e28936bcc6f5956c6eab724975a72ae35869b387709c9280dc4e17738181"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.900389 4720 scope.go:117] "RemoveContainer" containerID="f16aaabb5619940ea1f57988c30451dc484e4600daff1551a784f8d03b34d96d"
Jan 21 14:57:58 crc kubenswrapper[4720]: I0121 14:57:58.916878 4720 scope.go:117] "RemoveContainer" containerID="0acbc31567e50b57eafcd661f7415e473d40a8ea1039c09546c667b2852b3e5b"
Jan 21 14:57:59 crc kubenswrapper[4720]: I0121 14:57:59.662299 4720 generic.go:334] "Generic (PLEG): container finished" podID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerID="e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98" exitCode=0
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerDied","Data":"e468cd7143756cc451fdd7913ff8db25bab6eb6bfa9003d3cf6cdc7970cd5c98"} Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.075898 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.147736 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.147994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.148058 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") pod \"7e4bbdff-6382-41c7-a054-bb15c6923e32\" (UID: \"7e4bbdff-6382-41c7-a054-bb15c6923e32\") " Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.152727 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v" (OuterVolumeSpecName: "kube-api-access-9tw9v") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "kube-api-access-9tw9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.175685 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory" (OuterVolumeSpecName: "inventory") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.180269 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e4bbdff-6382-41c7-a054-bb15c6923e32" (UID: "7e4bbdff-6382-41c7-a054-bb15c6923e32"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249874 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249909 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tw9v\" (UniqueName: \"kubernetes.io/projected/7e4bbdff-6382-41c7-a054-bb15c6923e32-kube-api-access-9tw9v\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.249919 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e4bbdff-6382-41c7-a054-bb15c6923e32-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682040 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" event={"ID":"7e4bbdff-6382-41c7-a054-bb15c6923e32","Type":"ContainerDied","Data":"a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0"} Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682086 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b310bef0977ddb0861adfd9f34f1856719d1601a607a8d4c42e05b686fefc0" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.682145 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-xdsct" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795166 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:01 crc kubenswrapper[4720]: E0121 14:58:01.795533 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795552 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.795735 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e4bbdff-6382-41c7-a054-bb15c6923e32" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.796307 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799532 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799538 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799817 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.799903 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.808957 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871289 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871538 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.871842 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.973623 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.974075 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.974130 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.980164 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.983133 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:01 crc kubenswrapper[4720]: I0121 14:58:01.993417 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.148929 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.653641 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd"] Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.689322 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:02 crc kubenswrapper[4720]: E0121 14:58:02.689675 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:02 crc kubenswrapper[4720]: I0121 14:58:02.704964 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerStarted","Data":"85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8"} Jan 21 14:58:03 crc kubenswrapper[4720]: I0121 14:58:03.716621 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerStarted","Data":"73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88"} Jan 21 14:58:03 crc kubenswrapper[4720]: I0121 14:58:03.768438 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" podStartSLOduration=2.250700711 podStartE2EDuration="2.768418323s" podCreationTimestamp="2026-01-21 14:58:01 +0000 UTC" firstStartedPulling="2026-01-21 14:58:02.662760091 +0000 UTC m=+1720.571500023" lastFinishedPulling="2026-01-21 14:58:03.180477683 +0000 UTC m=+1721.089217635" observedRunningTime="2026-01-21 14:58:03.734839569 +0000 UTC m=+1721.643579541" watchObservedRunningTime="2026-01-21 14:58:03.768418323 +0000 UTC m=+1721.677158255" Jan 21 14:58:08 crc kubenswrapper[4720]: I0121 14:58:08.769946 4720 generic.go:334] "Generic (PLEG): container finished" podID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerID="73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88" exitCode=0 Jan 21 14:58:08 crc kubenswrapper[4720]: I0121 14:58:08.770038 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerDied","Data":"73061a839b8863ce34641584fb62efcd992e6515e42bf1c74ff8dba240765a88"} Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.224139 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233193 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233326 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.233557 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") pod \"1708e39a-582c-42e2-8c2e-d71fef75a183\" (UID: \"1708e39a-582c-42e2-8c2e-d71fef75a183\") " Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.241703 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5" (OuterVolumeSpecName: "kube-api-access-2hgk5") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "kube-api-access-2hgk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.270168 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.270943 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory" (OuterVolumeSpecName: "inventory") pod "1708e39a-582c-42e2-8c2e-d71fef75a183" (UID: "1708e39a-582c-42e2-8c2e-d71fef75a183"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335539 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335567 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1708e39a-582c-42e2-8c2e-d71fef75a183-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.335576 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hgk5\" (UniqueName: \"kubernetes.io/projected/1708e39a-582c-42e2-8c2e-d71fef75a183-kube-api-access-2hgk5\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790217 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" event={"ID":"1708e39a-582c-42e2-8c2e-d71fef75a183","Type":"ContainerDied","Data":"85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8"} Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790268 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85968b6c8a32e79bef64859e9cbfa0fb3f67e8e1d38a94743b27db0c0bab4ed8" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.790345 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.892437 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:10 crc kubenswrapper[4720]: E0121 14:58:10.892835 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.892858 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.893077 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="1708e39a-582c-42e2-8c2e-d71fef75a183" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.893814 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.903067 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934119 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934340 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.934544 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.938309 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961750 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961806 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:10 crc kubenswrapper[4720]: I0121 14:58:10.961846 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.063846 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.064437 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.064631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.069502 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.070555 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.079966 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-rwbjg\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.262183 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:11 crc kubenswrapper[4720]: I0121 14:58:11.787188 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg"] Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.827182 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerStarted","Data":"84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b"} Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.827505 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerStarted","Data":"aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb"} Jan 21 14:58:12 crc kubenswrapper[4720]: I0121 14:58:12.854708 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" podStartSLOduration=2.424295347 podStartE2EDuration="2.854687183s" podCreationTimestamp="2026-01-21 14:58:10 +0000 UTC" firstStartedPulling="2026-01-21 14:58:11.811067972 +0000 UTC m=+1729.719807904" lastFinishedPulling="2026-01-21 14:58:12.241459758 +0000 UTC m=+1730.150199740" observedRunningTime="2026-01-21 14:58:12.845387518 +0000 UTC m=+1730.754127470" watchObservedRunningTime="2026-01-21 14:58:12.854687183 +0000 UTC m=+1730.763427135" Jan 21 14:58:17 crc kubenswrapper[4720]: I0121 14:58:17.678923 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:17 crc kubenswrapper[4720]: E0121 14:58:17.679897 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.049831 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.063398 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fhvrr"] Jan 21 14:58:20 crc kubenswrapper[4720]: I0121 14:58:20.689587 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a6c6de6-8f88-4c87-bd8e-46579996948e" path="/var/lib/kubelet/pods/7a6c6de6-8f88-4c87-bd8e-46579996948e/volumes" Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.048093 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.060259 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fh44q"] Jan 21 14:58:22 crc kubenswrapper[4720]: I0121 14:58:22.695027 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72a4a042-08eb-4644-81c0-2cfcd105cf2b" path="/var/lib/kubelet/pods/72a4a042-08eb-4644-81c0-2cfcd105cf2b/volumes" Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.043202 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.060730 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lsn2k"] Jan 21 14:58:31 crc kubenswrapper[4720]: I0121 14:58:31.678623 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:31 crc kubenswrapper[4720]: E0121 14:58:31.678922 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:32 crc kubenswrapper[4720]: I0121 14:58:32.688315 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e400cd-53d2-4738-96f0-75829e339879" path="/var/lib/kubelet/pods/03e400cd-53d2-4738-96f0-75829e339879/volumes" Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.030711 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.039779 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wtr5d"] Jan 21 14:58:38 crc kubenswrapper[4720]: I0121 14:58:38.687541 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eaf7930-34cf-4396-9b94-c09d3a5da09a" path="/var/lib/kubelet/pods/2eaf7930-34cf-4396-9b94-c09d3a5da09a/volumes" Jan 21 14:58:39 crc kubenswrapper[4720]: I0121 14:58:39.030998 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:58:39 crc kubenswrapper[4720]: I0121 14:58:39.041916 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-vz5k2"] Jan 21 14:58:40 crc kubenswrapper[4720]: I0121 14:58:40.698041 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d468a637-b18d-47fd-9b04-910dba72a955" path="/var/lib/kubelet/pods/d468a637-b18d-47fd-9b04-910dba72a955/volumes" Jan 21 14:58:44 crc kubenswrapper[4720]: I0121 14:58:44.678621 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:44 crc kubenswrapper[4720]: E0121 14:58:44.679384 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:57 crc kubenswrapper[4720]: I0121 14:58:57.678422 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:58:57 crc kubenswrapper[4720]: E0121 14:58:57.680206 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:58:58 crc kubenswrapper[4720]: I0121 14:58:58.211770 4720 generic.go:334] "Generic (PLEG): container finished" podID="5c493941-48f3-4a3e-a66a-4f045487005e" containerID="84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b" exitCode=0 Jan 21 14:58:58 crc kubenswrapper[4720]: I0121 14:58:58.212263 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerDied","Data":"84b7c1102ed8905c57b99200f382f83610f148178a176ed251bd4301f0f84e8b"} Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.132340 4720 scope.go:117] "RemoveContainer" containerID="e157ae31f96b07ea02b29f98dae94eb3d7d5795415a495d65698b9a5085c7130" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.199943 4720 scope.go:117] "RemoveContainer" containerID="aa36f5e3e3dbee78955e3cde60ca553a839782ae810aaa8a755ce96f2d298234" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.265168 4720 scope.go:117] "RemoveContainer" containerID="da30657364957537118b3484996473e61293d2c96c58d296138cfcceba62bd38" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.306921 4720 scope.go:117] "RemoveContainer" containerID="4cbcc32aeb798aaa7b0d77c7b3bd3ce53ec4708a0626d5329246f49a64fe4d07" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.358121 4720 scope.go:117] "RemoveContainer" containerID="4e24f13f0ad5e20e473814aa465820a276b981501127e963947f9007b3bccb91" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.553539 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.632858 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.632951 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.633004 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") pod \"5c493941-48f3-4a3e-a66a-4f045487005e\" (UID: \"5c493941-48f3-4a3e-a66a-4f045487005e\") " Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.639321 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn" (OuterVolumeSpecName: "kube-api-access-56fqn") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "kube-api-access-56fqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.657703 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory" (OuterVolumeSpecName: "inventory") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.663520 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5c493941-48f3-4a3e-a66a-4f045487005e" (UID: "5c493941-48f3-4a3e-a66a-4f045487005e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735289 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735329 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fqn\" (UniqueName: \"kubernetes.io/projected/5c493941-48f3-4a3e-a66a-4f045487005e-kube-api-access-56fqn\") on node \"crc\" DevicePath \"\"" Jan 21 14:58:59 crc kubenswrapper[4720]: I0121 14:58:59.735343 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5c493941-48f3-4a3e-a66a-4f045487005e-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258541 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" event={"ID":"5c493941-48f3-4a3e-a66a-4f045487005e","Type":"ContainerDied","Data":"aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb"} Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258888 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aed7e6e3d2bc12e37696131e0b0339af7c33a47dc639b79b4f4ace583ab25aeb" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.258587 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-rwbjg" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.317640 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:00 crc kubenswrapper[4720]: E0121 14:59:00.318012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318029 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318222 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c493941-48f3-4a3e-a66a-4f045487005e" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.318785 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.320955 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325128 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325282 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.325360 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.333240 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.455600 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.455967 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.456104 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557415 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557497 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.557568 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.563871 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.563945 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.574207 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:00 crc kubenswrapper[4720]: I0121 14:59:00.638636 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:01 crc kubenswrapper[4720]: I0121 14:59:01.167075 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9"] Jan 21 14:59:01 crc kubenswrapper[4720]: I0121 14:59:01.267728 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerStarted","Data":"0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576"} Jan 21 14:59:02 crc kubenswrapper[4720]: I0121 14:59:02.276954 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerStarted","Data":"0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe"} Jan 21 14:59:02 crc kubenswrapper[4720]: I0121 14:59:02.301007 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" podStartSLOduration=1.655538593 podStartE2EDuration="2.300986759s" podCreationTimestamp="2026-01-21 14:59:00 +0000 UTC" firstStartedPulling="2026-01-21 14:59:01.178130043 +0000 UTC m=+1779.086869995" lastFinishedPulling="2026-01-21 14:59:01.823578229 +0000 UTC m=+1779.732318161" observedRunningTime="2026-01-21 14:59:02.297879409 +0000 UTC m=+1780.206619361" watchObservedRunningTime="2026-01-21 14:59:02.300986759 +0000 UTC m=+1780.209726691" Jan 21 14:59:06 crc kubenswrapper[4720]: I0121 14:59:06.310713 4720 generic.go:334] "Generic (PLEG): container finished" podID="09ee2ae5-f10a-4080-90df-29c01525e871" 
containerID="0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe" exitCode=0 Jan 21 14:59:06 crc kubenswrapper[4720]: I0121 14:59:06.311771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerDied","Data":"0c95cea8482b0819d7a21f58cf98e8eeec4346801cb75dd588a65b0836fd1afe"} Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.312548 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335013 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" event={"ID":"09ee2ae5-f10a-4080-90df-29c01525e871","Type":"ContainerDied","Data":"0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576"} Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335052 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0ae62ba40bf3a5bedeb2c32f16a9363d168087cad8a0ff8c897eda0877c576" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.335106 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415527 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415611 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.415644 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") pod \"09ee2ae5-f10a-4080-90df-29c01525e871\" (UID: \"09ee2ae5-f10a-4080-90df-29c01525e871\") " Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.427224 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv" (OuterVolumeSpecName: "kube-api-access-n4kwv") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "kube-api-access-n4kwv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435194 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:08 crc kubenswrapper[4720]: E0121 14:59:08.435552 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435567 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.435810 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ee2ae5-f10a-4080-90df-29c01525e871" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.436566 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.484589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory" (OuterVolumeSpecName: "inventory") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.488990 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "09ee2ae5-f10a-4080-90df-29c01525e871" (UID: "09ee2ae5-f10a-4080-90df-29c01525e871"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.490719 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520718 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4kwv\" (UniqueName: \"kubernetes.io/projected/09ee2ae5-f10a-4080-90df-29c01525e871-kube-api-access-n4kwv\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520752 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.520761 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09ee2ae5-f10a-4080-90df-29c01525e871-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622310 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622362 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.622392 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723670 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723709 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.723733 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.729178 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.729422 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.740985 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:08 crc kubenswrapper[4720]: I0121 14:59:08.866422 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 14:59:10 crc kubenswrapper[4720]: I0121 14:59:10.001151 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq"] Jan 21 14:59:10 crc kubenswrapper[4720]: I0121 14:59:10.352128 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerStarted","Data":"e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65"} Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.361155 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerStarted","Data":"15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453"} Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.387576 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" podStartSLOduration=3.005119009 podStartE2EDuration="3.38755669s" podCreationTimestamp="2026-01-21 14:59:08 +0000 UTC" firstStartedPulling="2026-01-21 14:59:10.014227718 +0000 UTC m=+1787.922967650" lastFinishedPulling="2026-01-21 14:59:10.396665399 +0000 UTC m=+1788.305405331" observedRunningTime="2026-01-21 14:59:11.381217568 +0000 UTC m=+1789.289957530" watchObservedRunningTime="2026-01-21 14:59:11.38755669 +0000 UTC m=+1789.296296622" Jan 21 14:59:11 crc kubenswrapper[4720]: I0121 14:59:11.678318 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:11 crc kubenswrapper[4720]: E0121 
14:59:11.678589 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:23 crc kubenswrapper[4720]: I0121 14:59:23.678530 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:23 crc kubenswrapper[4720]: E0121 14:59:23.679169 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.054696 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.073270 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.082722 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.095492 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.105248 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-c5zqd"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.114506 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.123866 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-62k9x"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.131863 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.139333 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d9b2-account-create-update-dld7b"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.145230 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-61ab-account-create-update-4mch7"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.150755 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b472-account-create-update-cmqsp"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.156324 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-99kbn"] Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.692912 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f8146d-b3dd-48a4-b1a8-9fa590c0d808" path="/var/lib/kubelet/pods/01f8146d-b3dd-48a4-b1a8-9fa590c0d808/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.693832 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a08abcad-85f1-431b-853e-3599eebed756" path="/var/lib/kubelet/pods/a08abcad-85f1-431b-853e-3599eebed756/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.694750 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12f971e-bd5e-4b60-9d28-06c786d852ae" path="/var/lib/kubelet/pods/a12f971e-bd5e-4b60-9d28-06c786d852ae/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.695562 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad73ec2f-ba76-4451-8202-33403a41de12" path="/var/lib/kubelet/pods/ad73ec2f-ba76-4451-8202-33403a41de12/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.697621 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af31d5e0-11e6-433b-a31e-bea14d7e5c95" path="/var/lib/kubelet/pods/af31d5e0-11e6-433b-a31e-bea14d7e5c95/volumes" Jan 21 14:59:32 crc kubenswrapper[4720]: I0121 14:59:32.698864 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cf579e-cb45-4984-8558-107b9576d977" path="/var/lib/kubelet/pods/d9cf579e-cb45-4984-8558-107b9576d977/volumes" Jan 21 14:59:36 crc kubenswrapper[4720]: I0121 14:59:36.679183 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:36 crc kubenswrapper[4720]: E0121 14:59:36.680108 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:51 crc kubenswrapper[4720]: I0121 14:59:51.678694 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 14:59:51 crc kubenswrapper[4720]: E0121 14:59:51.680921 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.517001 4720 scope.go:117] "RemoveContainer" containerID="3c91133a01a4614de36a8d666a7d07c7ef46c013dcc30aab91a584e4c3f9d821" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.550203 4720 scope.go:117] "RemoveContainer" containerID="72ca4e3efda677c6d5505c06f76f801874dedd82499a86395269317817d91b41" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.591168 4720 scope.go:117] "RemoveContainer" containerID="640739b09d2283081f0c3b2a06de0e2de45e7dd328c1f454ca1fe542c003fad9" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.630705 4720 scope.go:117] "RemoveContainer" containerID="76e56b7b117cebd65fc0e8a56b27da7c2b84bd042ac8ed9b2babbcdfb78864a1" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.674345 4720 scope.go:117] "RemoveContainer" containerID="8c3eb39f9b9627b072a3900c90555cd68e5d7daab86658e513ca3c054e6b4044" Jan 21 14:59:59 crc kubenswrapper[4720]: I0121 14:59:59.710074 4720 scope.go:117] "RemoveContainer" containerID="582c2f5a67c5087ceb2090b4f845673a61d252b5b4bb8a1030a72f2c63755ab3" Jan 21 15:00:00 
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.152520 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"]
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.153598 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.156846 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.171685 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"]
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.191674 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.312986 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.313039 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.313076 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415022 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.415127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.416289 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.421093 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.431109 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"collect-profiles-29483460-95ltn\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.514890 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:00 crc kubenswrapper[4720]: I0121 15:00:00.996358 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"]
Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789457 4720 generic.go:334] "Generic (PLEG): container finished" podID="d92d32a0-256b-4078-a4cf-fe678205141c" containerID="5eabcb934e6e2604ac38974d36efe8a72af780fa2ad0365a3b5f182a6ce58b8c" exitCode=0
Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789566 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerDied","Data":"5eabcb934e6e2604ac38974d36efe8a72af780fa2ad0365a3b5f182a6ce58b8c"}
Jan 21 15:00:01 crc kubenswrapper[4720]: I0121 15:00:01.789970 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerStarted","Data":"d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6"}
Jan 21 15:00:02 crc kubenswrapper[4720]: I0121 15:00:02.686024 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"
Jan 21 15:00:02 crc kubenswrapper[4720]: E0121 15:00:02.688922 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996"
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.092928 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn"
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172350 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") "
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172480 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") "
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.172594 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") pod \"d92d32a0-256b-4078-a4cf-fe678205141c\" (UID: \"d92d32a0-256b-4078-a4cf-fe678205141c\") "
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.173751 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume" (OuterVolumeSpecName: "config-volume") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.180986 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz" (OuterVolumeSpecName: "kube-api-access-cvltz") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "kube-api-access-cvltz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.184833 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d92d32a0-256b-4078-a4cf-fe678205141c" (UID: "d92d32a0-256b-4078-a4cf-fe678205141c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274510 4720 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d92d32a0-256b-4078-a4cf-fe678205141c-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274554 4720 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d92d32a0-256b-4078-a4cf-fe678205141c-config-volume\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.274565 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvltz\" (UniqueName: \"kubernetes.io/projected/d92d32a0-256b-4078-a4cf-fe678205141c-kube-api-access-cvltz\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805347 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" event={"ID":"d92d32a0-256b-4078-a4cf-fe678205141c","Type":"ContainerDied","Data":"d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6"} Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805630 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d40250c1e89baaaad3a8ab3d072992e3f510588ed754eadf4ed15205c79738a6" Jan 21 15:00:03 crc kubenswrapper[4720]: I0121 15:00:03.805420 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29483460-95ltn" Jan 21 15:00:09 crc kubenswrapper[4720]: I0121 15:00:09.039547 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 15:00:09 crc kubenswrapper[4720]: I0121 15:00:09.052162 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-vm954"] Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.025099 4720 generic.go:334] "Generic (PLEG): container finished" podID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerID="15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453" exitCode=0 Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.025135 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerDied","Data":"15632d2747d70d997cd9421ae03a766ab0f8b8e86525dad4d08ea842212ff453"} Jan 21 15:00:10 crc kubenswrapper[4720]: I0121 15:00:10.690463 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dda8050-939a-4a64-b119-b718b60c7887" path="/var/lib/kubelet/pods/4dda8050-939a-4a64-b119-b718b60c7887/volumes" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.451996 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574420 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574726 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.574871 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") pod \"5e910d6d-e1c9-447a-9584-0338f9151f26\" (UID: \"5e910d6d-e1c9-447a-9584-0338f9151f26\") " Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.582836 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5" (OuterVolumeSpecName: "kube-api-access-k4gq5") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "kube-api-access-k4gq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.599607 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory" (OuterVolumeSpecName: "inventory") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.602351 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e910d6d-e1c9-447a-9584-0338f9151f26" (UID: "5e910d6d-e1c9-447a-9584-0338f9151f26"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677478 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4gq5\" (UniqueName: \"kubernetes.io/projected/5e910d6d-e1c9-447a-9584-0338f9151f26-kube-api-access-k4gq5\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677530 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:11 crc kubenswrapper[4720]: I0121 15:00:11.677543 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e910d6d-e1c9-447a-9584-0338f9151f26-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045354 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" event={"ID":"5e910d6d-e1c9-447a-9584-0338f9151f26","Type":"ContainerDied","Data":"e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65"} Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045394 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.045402 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5d1100794969fb7a90ec0d0a1822e837ef096202c22d88f476ed2c30b64dd65" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.193241 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:12 crc kubenswrapper[4720]: E0121 15:00:12.194077 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194127 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: E0121 15:00:12.194204 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194218 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194580 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92d32a0-256b-4078-a4cf-fe678205141c" containerName="collect-profiles" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.194646 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e910d6d-e1c9-447a-9584-0338f9151f26" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.195948 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198205 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198589 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.198731 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.199203 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.227754 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.288624 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.288956 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.289045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390631 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390698 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.390822 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc 
kubenswrapper[4720]: I0121 15:00:12.395775 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.399716 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.420241 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"ssh-known-hosts-edpm-deployment-4ngb6\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:12 crc kubenswrapper[4720]: I0121 15:00:12.549534 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:13 crc kubenswrapper[4720]: I0121 15:00:13.116285 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-4ngb6"] Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.060671 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerStarted","Data":"cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2"} Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.060961 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerStarted","Data":"7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e"} Jan 21 15:00:14 crc kubenswrapper[4720]: I0121 15:00:14.092161 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" podStartSLOduration=1.6255253939999998 podStartE2EDuration="2.092143049s" podCreationTimestamp="2026-01-21 15:00:12 +0000 UTC" firstStartedPulling="2026-01-21 15:00:13.124935763 +0000 UTC m=+1851.033675695" lastFinishedPulling="2026-01-21 15:00:13.591553418 +0000 UTC m=+1851.500293350" observedRunningTime="2026-01-21 15:00:14.086048794 +0000 UTC m=+1851.994788726" watchObservedRunningTime="2026-01-21 15:00:14.092143049 +0000 UTC m=+1852.000883001" Jan 21 15:00:16 crc kubenswrapper[4720]: I0121 15:00:16.678870 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:16 crc kubenswrapper[4720]: E0121 15:00:16.679975 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:21 crc 
kubenswrapper[4720]: I0121 15:00:21.113609 4720 generic.go:334] "Generic (PLEG): container finished" podID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerID="cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2" exitCode=0 Jan 21 15:00:21 crc kubenswrapper[4720]: I0121 15:00:21.113679 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerDied","Data":"cc3d37e454530494f6e10cb0c5c4a654b028edb59941ddc672e9fae6eb52eaa2"} Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.497458 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.567801 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.567878 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.568603 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") pod \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\" (UID: \"d64c2129-c3c8-4f00-ac2f-750094e2ea79\") " Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.573179 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq" (OuterVolumeSpecName: "kube-api-access-kfvdq") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "kube-api-access-kfvdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.592844 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.600178 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d64c2129-c3c8-4f00-ac2f-750094e2ea79" (UID: "d64c2129-c3c8-4f00-ac2f-750094e2ea79"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670428 4720 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670698 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d64c2129-c3c8-4f00-ac2f-750094e2ea79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:22 crc kubenswrapper[4720]: I0121 15:00:22.670773 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfvdq\" (UniqueName: \"kubernetes.io/projected/d64c2129-c3c8-4f00-ac2f-750094e2ea79-kube-api-access-kfvdq\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132729 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" event={"ID":"d64c2129-c3c8-4f00-ac2f-750094e2ea79","Type":"ContainerDied","Data":"7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e"} Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132770 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4f49ba21a269201b2f1754840f5d53404770afb029a06e3b0473599195c10e" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.132808 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-4ngb6" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.207926 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:23 crc kubenswrapper[4720]: E0121 15:00:23.208428 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.208457 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.208749 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64c2129-c3c8-4f00-ac2f-750094e2ea79" containerName="ssh-known-hosts-edpm-deployment" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.209433 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213347 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213387 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.213515 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.215077 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.218082 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385540 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385599 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.385647 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488247 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488708 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.488809 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.494864 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.496312 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.505624 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bzgnr\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:23 crc kubenswrapper[4720]: I0121 15:00:23.527279 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:24 crc kubenswrapper[4720]: I0121 15:00:24.027614 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr"] Jan 21 15:00:24 crc kubenswrapper[4720]: I0121 15:00:24.140580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerStarted","Data":"402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af"} Jan 21 15:00:25 crc kubenswrapper[4720]: I0121 15:00:25.151341 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerStarted","Data":"e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3"} Jan 21 15:00:25 crc kubenswrapper[4720]: I0121 15:00:25.182879 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" podStartSLOduration=1.7642144960000001 podStartE2EDuration="2.182854033s" podCreationTimestamp="2026-01-21 15:00:23 +0000 UTC" firstStartedPulling="2026-01-21 15:00:24.031008296 +0000 UTC m=+1861.939748228" lastFinishedPulling="2026-01-21 15:00:24.449647823 +0000 UTC m=+1862.358387765" observedRunningTime="2026-01-21 15:00:25.173082363 +0000 UTC m=+1863.081822295" watchObservedRunningTime="2026-01-21 15:00:25.182854033 +0000 UTC m=+1863.091593975" Jan 21 15:00:30 crc kubenswrapper[4720]: I0121 15:00:30.679108 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:30 crc kubenswrapper[4720]: E0121 15:00:30.679732 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:33 crc kubenswrapper[4720]: I0121 15:00:33.210969 4720 generic.go:334] "Generic (PLEG): container finished" podID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerID="e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3" exitCode=0 Jan 21 15:00:33 crc kubenswrapper[4720]: I0121 15:00:33.211082 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerDied","Data":"e84f3f24ab3050fdfead3ecc7d62f7a95baea3ef007470fbf033b189c03a29a3"} Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.044126 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.054904 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jcm9t"] Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.639038 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.689368 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57a2637-15ee-4c59-881b-9364ffde9ffc" path="/var/lib/kubelet/pods/b57a2637-15ee-4c59-881b-9364ffde9ffc/volumes" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723434 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723509 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.723550 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") pod \"595ce90e-f537-4d7f-be8f-a4da40103ab1\" (UID: \"595ce90e-f537-4d7f-be8f-a4da40103ab1\") " Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.732942 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7" (OuterVolumeSpecName: "kube-api-access-68lf7") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). InnerVolumeSpecName "kube-api-access-68lf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.753888 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory" (OuterVolumeSpecName: "inventory") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.754306 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "595ce90e-f537-4d7f-be8f-a4da40103ab1" (UID: "595ce90e-f537-4d7f-be8f-a4da40103ab1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825259 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825546 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/595ce90e-f537-4d7f-be8f-a4da40103ab1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:34 crc kubenswrapper[4720]: I0121 15:00:34.825636 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68lf7\" (UniqueName: \"kubernetes.io/projected/595ce90e-f537-4d7f-be8f-a4da40103ab1-kube-api-access-68lf7\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229115 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" event={"ID":"595ce90e-f537-4d7f-be8f-a4da40103ab1","Type":"ContainerDied","Data":"402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af"} Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229170 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402afb865f19cb021e17133dd2b9ffeda4950d35fe675a6d3caff6a308ea31af" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.229173 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bzgnr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.315753 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:35 crc kubenswrapper[4720]: E0121 15:00:35.318371 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.318397 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.318614 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="595ce90e-f537-4d7f-be8f-a4da40103ab1" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.319445 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324207 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324528 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q5rkp" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.324645 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.326370 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.331576 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.332979 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.333077 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.333165 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434271 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434309 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.434424 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.446544 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.454988 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.463949 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:35 crc kubenswrapper[4720]: I0121 15:00:35.696274 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:36 crc kubenswrapper[4720]: I0121 15:00:36.260591 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr"] Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.243483 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerStarted","Data":"80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f"} Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.243801 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerStarted","Data":"d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731"} Jan 21 15:00:37 crc kubenswrapper[4720]: I0121 15:00:37.262382 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" podStartSLOduration=1.802790232 podStartE2EDuration="2.262359645s" podCreationTimestamp="2026-01-21 15:00:35 +0000 UTC" firstStartedPulling="2026-01-21 15:00:36.259490547 +0000 UTC m=+1874.168230479" lastFinishedPulling="2026-01-21 15:00:36.71905996 +0000 UTC m=+1874.627799892" observedRunningTime="2026-01-21 15:00:37.257951431 +0000 UTC m=+1875.166691373" watchObservedRunningTime="2026-01-21 15:00:37.262359645 +0000 UTC m=+1875.171099597" Jan 21 15:00:38 crc kubenswrapper[4720]: I0121 15:00:38.041613 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 15:00:38 crc kubenswrapper[4720]: I0121 15:00:38.043058 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwmwq"] Jan 21 15:00:38 
crc kubenswrapper[4720]: I0121 15:00:38.689148 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f23517c-a9a1-4740-8b3b-d42b40cc8bc7" path="/var/lib/kubelet/pods/6f23517c-a9a1-4740-8b3b-d42b40cc8bc7/volumes" Jan 21 15:00:43 crc kubenswrapper[4720]: I0121 15:00:43.678705 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:43 crc kubenswrapper[4720]: E0121 15:00:43.679222 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:47 crc kubenswrapper[4720]: I0121 15:00:47.326020 4720 generic.go:334] "Generic (PLEG): container finished" podID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerID="80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f" exitCode=0 Jan 21 15:00:47 crc kubenswrapper[4720]: I0121 15:00:47.326089 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerDied","Data":"80cadf6f3bc6b22ed4aba2049e195881690691403abe700c83d2951f830a8f6f"} Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.738184 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869165 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869304 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.869339 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") pod \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\" (UID: \"64e0dfca-6b74-47c9-8f6f-76de697cf3e0\") " Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.875802 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j" (OuterVolumeSpecName: "kube-api-access-6mf8j") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "kube-api-access-6mf8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.900993 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory" (OuterVolumeSpecName: "inventory") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.910080 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64e0dfca-6b74-47c9-8f6f-76de697cf3e0" (UID: "64e0dfca-6b74-47c9-8f6f-76de697cf3e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971859 4720 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-inventory\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971892 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mf8j\" (UniqueName: \"kubernetes.io/projected/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-kube-api-access-6mf8j\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:48 crc kubenswrapper[4720]: I0121 15:00:48.971904 4720 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64e0dfca-6b74-47c9-8f6f-76de697cf3e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341470 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" event={"ID":"64e0dfca-6b74-47c9-8f6f-76de697cf3e0","Type":"ContainerDied","Data":"d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731"} Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341505 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d68a727dd916e43d3a8c2cfa254ca196f5227a82a509c28c3cfdb668d54f6731" Jan 21 15:00:49 crc kubenswrapper[4720]: I0121 15:00:49.341511 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr" Jan 21 15:00:57 crc kubenswrapper[4720]: I0121 15:00:57.679584 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:00:57 crc kubenswrapper[4720]: E0121 15:00:57.680371 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.831215 4720 scope.go:117] "RemoveContainer" containerID="0cf3fdd52f65dc4830c6503325bd2251a454cdee26406e44f66cb14c6ec26e1c" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.875028 4720 scope.go:117] "RemoveContainer" containerID="1cacd08b92a88ab371232f39ef9e5865d3573d5d8458ae4746910cd77bac3530" Jan 21 15:00:59 crc kubenswrapper[4720]: I0121 15:00:59.934991 4720 scope.go:117] "RemoveContainer" containerID="48dd6f7a9d23c8b16a78f67df238aa1196e7c893c560fa3cacb0f6b87e00728a" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.151331 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:00 crc kubenswrapper[4720]: E0121 15:01:00.152012 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.152129 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.152466 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e0dfca-6b74-47c9-8f6f-76de697cf3e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.153113 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.166335 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273345 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273439 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273494 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.273529 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375060 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375128 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375177 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.375277 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.381523 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.384456 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.386114 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.394276 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"keystone-cron-29483461-qxsqs\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.475004 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:00 crc kubenswrapper[4720]: W0121 15:01:00.920402 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d585381_d477_4c8d_af17_6194044b6de1.slice/crio-7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913 WatchSource:0}: Error finding container 7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913: Status 404 returned error can't find the container with id 7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913 Jan 21 15:01:00 crc kubenswrapper[4720]: I0121 15:01:00.921742 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29483461-qxsqs"] Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.460182 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerStarted","Data":"41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57"} Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.460558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerStarted","Data":"7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913"} Jan 21 15:01:01 crc kubenswrapper[4720]: I0121 15:01:01.503735 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29483461-qxsqs" podStartSLOduration=1.5037143670000002 podStartE2EDuration="1.503714367s" podCreationTimestamp="2026-01-21 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-21 15:01:01.496669295 +0000 UTC m=+1899.405409247" watchObservedRunningTime="2026-01-21 15:01:01.503714367 +0000 UTC m=+1899.412454309" Jan 21 15:01:03 crc kubenswrapper[4720]: I0121 15:01:03.477024 
4720 generic.go:334] "Generic (PLEG): container finished" podID="4d585381-d477-4c8d-af17-6194044b6de1" containerID="41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57" exitCode=0 Jan 21 15:01:03 crc kubenswrapper[4720]: I0121 15:01:03.477121 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerDied","Data":"41ed3f5ba3eb7241bb1a9185ebb40c52b90ca854816e7caaa743d314a0bd5e57"} Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.803400 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859283 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859510 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859541 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.859629 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") pod \"4d585381-d477-4c8d-af17-6194044b6de1\" (UID: \"4d585381-d477-4c8d-af17-6194044b6de1\") " Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.873899 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.878039 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr" (OuterVolumeSpecName: "kube-api-access-q5vbr") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "kube-api-access-q5vbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.899054 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.925173 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data" (OuterVolumeSpecName: "config-data") pod "4d585381-d477-4c8d-af17-6194044b6de1" (UID: "4d585381-d477-4c8d-af17-6194044b6de1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961375 4720 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961408 4720 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961417 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5vbr\" (UniqueName: \"kubernetes.io/projected/4d585381-d477-4c8d-af17-6194044b6de1-kube-api-access-q5vbr\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:04 crc kubenswrapper[4720]: I0121 15:01:04.961428 4720 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d585381-d477-4c8d-af17-6194044b6de1-config-data\") on node \"crc\" DevicePath \"\"" Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.499772 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29483461-qxsqs" event={"ID":"4d585381-d477-4c8d-af17-6194044b6de1","Type":"ContainerDied","Data":"7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913"} Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.500037 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e20e2b9220b40c392054b123d87a20955becc669e8cb584e42c590030312913" Jan 21 15:01:05 crc kubenswrapper[4720]: I0121 15:01:05.499867 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29483461-qxsqs" Jan 21 15:01:11 crc kubenswrapper[4720]: I0121 15:01:11.678367 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:11 crc kubenswrapper[4720]: E0121 15:01:11.679381 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.065347 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.077221 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7qf47"] Jan 21 15:01:20 crc kubenswrapper[4720]: I0121 15:01:20.694363 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8fc07ed-67cb-4459-b7cb-ea8101ea4317" path="/var/lib/kubelet/pods/d8fc07ed-67cb-4459-b7cb-ea8101ea4317/volumes" Jan 21 15:01:25 crc kubenswrapper[4720]: I0121 15:01:25.679074 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:25 crc kubenswrapper[4720]: E0121 15:01:25.679637 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.742157 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:29 crc kubenswrapper[4720]: E0121 15:01:29.742755 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.742941 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.743090 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d585381-d477-4c8d-af17-6194044b6de1" containerName="keystone-cron" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.743949 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.747373 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ltcrl"/"kube-root-ca.crt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.747878 4720 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ltcrl"/"openshift-service-ca.crt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.768284 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.831453 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.831615 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933161 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933307 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.933731 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:29 crc kubenswrapper[4720]: I0121 15:01:29.957354 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"must-gather-kz6gt\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.064607 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.623174 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:01:30 crc kubenswrapper[4720]: W0121 15:01:30.630714 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ba91fa_9395_4dae_8bf6_384541b2d3ed.slice/crio-616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a WatchSource:0}: Error finding container 616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a: Status 404 returned error can't find the container with id 616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a Jan 21 15:01:30 crc kubenswrapper[4720]: I0121 15:01:30.705669 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"616a1c463942b8844a85522f6f0d3f6da17e8a9b8d20809c560cf5484f4cde3a"} Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.245272 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.247915 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.280550 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.336860 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.336929 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.337065 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.438935 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439002 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: 
\"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439096 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.439891 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.440150 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.460047 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"redhat-operators-6x2nt\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:34 crc kubenswrapper[4720]: I0121 15:01:34.582801 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.274678 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.800558 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.800604 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerStarted","Data":"1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803289 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" exitCode=0 Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803331 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7"} Jan 21 15:01:39 crc kubenswrapper[4720]: I0121 15:01:39.803353 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"50874e76ea4107b7c07d6f1ccee98b03d59ab44fcfb2f73925bdd79450642dc9"} Jan 21 15:01:39 crc 
kubenswrapper[4720]: I0121 15:01:39.833678 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" podStartSLOduration=2.596554255 podStartE2EDuration="10.833646403s" podCreationTimestamp="2026-01-21 15:01:29 +0000 UTC" firstStartedPulling="2026-01-21 15:01:30.634254829 +0000 UTC m=+1928.542994761" lastFinishedPulling="2026-01-21 15:01:38.871346977 +0000 UTC m=+1936.780086909" observedRunningTime="2026-01-21 15:01:39.830067114 +0000 UTC m=+1937.738807056" watchObservedRunningTime="2026-01-21 15:01:39.833646403 +0000 UTC m=+1937.742386335" Jan 21 15:01:40 crc kubenswrapper[4720]: I0121 15:01:40.678705 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:40 crc kubenswrapper[4720]: E0121 15:01:40.679897 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:41 crc kubenswrapper[4720]: I0121 15:01:41.824849 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} Jan 21 15:01:42 crc kubenswrapper[4720]: E0121 15:01:42.520583 4720 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.103:60486->38.102.83.103:43429: read tcp 38.102.83.103:60486->38.102.83.103:43429: read: connection reset by peer Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.212554 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.214052 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.215863 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ltcrl"/"default-dockercfg-bztpl" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.357632 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.357768 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.459851 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.459932 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.460089 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.480250 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"crc-debug-fblvn\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.533608 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:01:43 crc kubenswrapper[4720]: W0121 15:01:43.563823 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b489c8d_aa41_41cf_a984_9479eda75544.slice/crio-51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830 WatchSource:0}: Error finding container 51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830: Status 404 returned error can't find the container with id 51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830 Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.566419 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:01:43 crc kubenswrapper[4720]: I0121 15:01:43.841000 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerStarted","Data":"51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830"} Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.865228 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f448c69d6-sjp2r_3b177763-3020-4854-b45a-43d99221c670/barbican-api-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.870222 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" exitCode=0 Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.870261 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.884414 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5f448c69d6-sjp2r_3b177763-3020-4854-b45a-43d99221c670/barbican-api/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.961157 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6898c4b994-dn9qn_bb475766-6891-454b-8f7e-1494d9806891/barbican-keystone-listener-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.968419 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6898c4b994-dn9qn_bb475766-6891-454b-8f7e-1494d9806891/barbican-keystone-listener/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.979860 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8f88c9d47-m5rzn_9355d502-bf01-4465-996d-483d99b92954/barbican-worker-log/0.log" Jan 21 15:01:46 crc kubenswrapper[4720]: I0121 15:01:46.989338 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-8f88c9d47-m5rzn_9355d502-bf01-4465-996d-483d99b92954/barbican-worker/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.056711 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-gqmj6_b96fb314-d163-41a0-b2b0-9a9c117d504c/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.101413 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/ceilometer-central-agent/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.125667 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/ceilometer-notification-agent/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.133134 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/sg-core/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.137583 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2d3ebf5b-f0c4-472e-b4a3-e5f8cab66ffe/proxy-httpd/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.149189 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-bqds9_09ee2ae5-f10a-4080-90df-29c01525e871/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.160362 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4281fdf-eb56-41e8-a750-13ee7ac37bea/cinder-api-log/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.199027 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_e4281fdf-eb56-41e8-a750-13ee7ac37bea/cinder-api/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.257344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0896fa5e-6919-42bf-9e61-cf73218e9edf/cinder-scheduler/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.278842 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0896fa5e-6919-42bf-9e61-cf73218e9edf/probe/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.316291 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-xdsct_7e4bbdff-6382-41c7-a054-bb15c6923e32/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.341848 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kt9mq_5e910d6d-e1c9-447a-9584-0338f9151f26/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.376859 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-bgxfr_248ea464-73a3-4083-bb27-fc2cb7347224/dnsmasq-dns/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.386549 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6c5d8cf46f-bgxfr_248ea464-73a3-4083-bb27-fc2cb7347224/init/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.438885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-rwbjg_5c493941-48f3-4a3e-a66a-4f045487005e/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.491038 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-69cc8766db-gdch7_0edd5078-75bc-4823-b52f-ad5effeace06/keystone-api/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.498970 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29483461-qxsqs_4d585381-d477-4c8d-af17-6194044b6de1/keystone-cron/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.512077 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_60d4c6e3-4a01-421e-aad1-1972ed16e528/kube-state-metrics/0.log" Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.883588 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerStarted","Data":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} Jan 21 15:01:47 crc kubenswrapper[4720]: I0121 15:01:47.910834 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6x2nt" podStartSLOduration=6.435078757 podStartE2EDuration="13.910815182s" podCreationTimestamp="2026-01-21 15:01:34 +0000 UTC" firstStartedPulling="2026-01-21 15:01:39.804827385 +0000 UTC m=+1937.713567317" lastFinishedPulling="2026-01-21 15:01:47.2805638 +0000 UTC m=+1945.189303742" observedRunningTime="2026-01-21 15:01:47.903032409 +0000 UTC m=+1945.811772341" watchObservedRunningTime="2026-01-21 15:01:47.910815182 +0000 UTC m=+1945.819555104" Jan 21 15:01:50 crc kubenswrapper[4720]: I0121 15:01:50.994458 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:50 crc kubenswrapper[4720]: I0121 15:01:50.996738 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.011447 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111465 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111606 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.111628 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.213829 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.213939 4720 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.214029 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.217204 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.217225 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.232913 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"redhat-marketplace-sxdtl\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:51 crc kubenswrapper[4720]: I0121 15:01:51.321793 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:01:54 crc kubenswrapper[4720]: I0121 15:01:54.583855 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:54 crc kubenswrapper[4720]: I0121 15:01:54.584194 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.634441 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" probeResult="failure" output=< Jan 21 15:01:55 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:01:55 crc kubenswrapper[4720]: > Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.678080 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:01:55 crc kubenswrapper[4720]: E0121 15:01:55.678462 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:01:55 crc kubenswrapper[4720]: I0121 15:01:55.982999 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_73c29d26-d7a2-40b5-81b8-ffda85c198d3/memcached/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.015171 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8b4f85f7-4kz9x_7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7/neutron-api/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.029007 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6c8b4f85f7-4kz9x_7ae63cc8-b2c1-44a7-9630-7b151ee5e0b7/neutron-httpd/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.128062 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_33c62270-7ab4-416b-bf5f-e0007f477733/nova-api-log/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.229885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_33c62270-7ab4-416b-bf5f-e0007f477733/nova-api-api/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.355721 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_496cefe3-f97b-4d8c-9a25-4a6533d9e64c/nova-cell0-conductor-conductor/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.368775 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.454444 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_679bb64e-c157-415f-9214-0f4e62001f03/nova-cell1-conductor-conductor/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.505816 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_5ea3e3dd-0e39-4a28-9112-27f0874af221/nova-cell1-novncproxy-novncproxy/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.579424 4720 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7177980c-4db3-4902-aac2-c0825b778b2a/nova-metadata-log/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.872974 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_7177980c-4db3-4902-aac2-c0825b778b2a/nova-metadata-metadata/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.961730 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_039c7115-f471-47ad-a7c4-75b1d7a40a94/nova-scheduler-scheduler/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.986432 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8a6a2220-24c4-4a0b-b72e-848dbac6a14b/galera/0.log" Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.993968 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerStarted","Data":"a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996452 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" exitCode=0 Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996478 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.996492 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"ec6aaee5e30b27a55c2206c76a5a2e84eaba6a236f2d843ee4cfb96336a189b9"} Jan 21 15:01:56 crc kubenswrapper[4720]: I0121 15:01:56.998645 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8a6a2220-24c4-4a0b-b72e-848dbac6a14b/mysql-bootstrap/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.031621 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab11441b-6bc4-4883-8a1e-866b31b425e9/galera/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.048432 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ab11441b-6bc4-4883-8a1e-866b31b425e9/mysql-bootstrap/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.048482 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" podStartSLOduration=1.7364206210000002 podStartE2EDuration="14.048470768s" podCreationTimestamp="2026-01-21 15:01:43 +0000 UTC" firstStartedPulling="2026-01-21 15:01:43.566104382 +0000 UTC m=+1941.474844314" lastFinishedPulling="2026-01-21 15:01:55.878154529 +0000 UTC m=+1953.786894461" observedRunningTime="2026-01-21 15:01:57.013256156 +0000 UTC m=+1954.921996088" watchObservedRunningTime="2026-01-21 15:01:57.048470768 +0000 UTC m=+1954.957210700" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.057465 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_4bb447ec-c7a1-4d3b-bcb7-e05d5ead9fa6/openstackclient/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 
15:01:57.089262 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-h55pf_4fc0e40b-c337-42d2-87a3-2eedfa2f1a65/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.106324 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovsdb-server/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.114497 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovs-vswitchd/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.122542 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-2v7f2_04da7387-73aa-43e0-b547-7ce56e71d865/ovsdb-server-init/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.133369 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-wpvzs_95379233-3cd8-4dd3-bf0f-b8198f2258e1/ovn-controller/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.146331 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_262f8354-3f7b-483f-940d-8b0f394e344a/ovn-northd/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.153579 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_262f8354-3f7b-483f-940d-8b0f394e344a/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.172379 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8cf4740-b779-4759-92d1-22ce3e5f1369/ovsdbserver-nb/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.180433 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_e8cf4740-b779-4759-92d1-22ce3e5f1369/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.195429 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b833ac6-f279-4dfb-84fb-22b531e6b7ef/ovsdbserver-sb/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.201016 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b833ac6-f279-4dfb-84fb-22b531e6b7ef/openstack-network-exporter/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.228077 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8648996d7d-4f2q4_37e9aac3-9710-4d1c-88a7-1a0a22b5a593/placement-log/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.242540 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-8648996d7d-4f2q4_37e9aac3-9710-4d1c-88a7-1a0a22b5a593/placement-api/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.258895 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4906b5ed-c663-4e81-ab33-2b8f33777cd1/rabbitmq/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.267498 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4906b5ed-c663-4e81-ab33-2b8f33777cd1/setup-container/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.286366 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f73dd82b-9ad1-4deb-b244-6d42a3f25f89/rabbitmq/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.292572 4720 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_f73dd82b-9ad1-4deb-b244-6d42a3f25f89/setup-container/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.310419 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b24tr_64e0dfca-6b74-47c9-8f6f-76de697cf3e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.321801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-9l2xt_0506243d-6216-4541-8f14-8b2c2beb409b/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.332758 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bzgnr_595ce90e-f537-4d7f-be8f-a4da40103ab1/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.360944 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-4ngb6_d64c2129-c3c8-4f00-ac2f-750094e2ea79/ssh-known-hosts-edpm-deployment/0.log" Jan 21 15:01:57 crc kubenswrapper[4720]: I0121 15:01:57.376360 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-v8kvd_1708e39a-582c-42e2-8c2e-d71fef75a183/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 21 15:01:59 crc kubenswrapper[4720]: I0121 15:01:59.016989 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} Jan 21 15:02:00 crc kubenswrapper[4720]: I0121 15:02:00.044807 4720 scope.go:117] "RemoveContainer" containerID="08798f35f080deb2759dc17480e0acb520080e74f20bec131db2674bbfdecfac" Jan 21 15:02:01 crc kubenswrapper[4720]: I0121 15:02:01.034801 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" exitCode=0 Jan 21 15:02:01 crc kubenswrapper[4720]: I0121 15:02:01.034942 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} Jan 21 15:02:04 crc kubenswrapper[4720]: I0121 15:02:04.076684 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerStarted","Data":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} Jan 21 15:02:04 crc kubenswrapper[4720]: I0121 15:02:04.100987 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxdtl" podStartSLOduration=7.461914655 podStartE2EDuration="14.100969927s" podCreationTimestamp="2026-01-21 15:01:50 +0000 UTC" firstStartedPulling="2026-01-21 15:01:56.998237375 +0000 UTC m=+1954.906977307" lastFinishedPulling="2026-01-21 15:02:03.637292647 +0000 UTC m=+1961.546032579" observedRunningTime="2026-01-21 15:02:04.100951717 +0000 UTC m=+1962.009691659" watchObservedRunningTime="2026-01-21 15:02:04.100969927 +0000 UTC m=+1962.009709859" Jan 21 15:02:05 crc 
kubenswrapper[4720]: I0121 15:02:05.678160 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" probeResult="failure" output=< Jan 21 15:02:05 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:02:05 crc kubenswrapper[4720]: > Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.390280 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log" Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.396707 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log" Jan 21 15:02:08 crc kubenswrapper[4720]: I0121 15:02:08.411822 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.409443 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.419126 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.429216 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.441013 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.449752 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.457981 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.467846 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.477914 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.494344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.520930 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.542212 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 
15:02:09.817313 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log" Jan 21 15:02:09 crc kubenswrapper[4720]: I0121 15:02:09.829766 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log" Jan 21 15:02:10 crc kubenswrapper[4720]: I0121 15:02:10.678495 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:02:10 crc kubenswrapper[4720]: E0121 15:02:10.678921 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.322981 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.323378 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:11 crc kubenswrapper[4720]: I0121 15:02:11.385926 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.134440 4720 generic.go:334] "Generic (PLEG): container finished" podID="4b489c8d-aa41-41cf-a984-9479eda75544" containerID="a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d" exitCode=0 Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.134530 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" event={"ID":"4b489c8d-aa41-41cf-a984-9479eda75544","Type":"ContainerDied","Data":"a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d"} Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.180033 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:12 crc kubenswrapper[4720]: I0121 15:02:12.230284 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.271324 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.304993 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.311661 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-fblvn"] Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340517 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") pod \"4b489c8d-aa41-41cf-a984-9479eda75544\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340570 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") pod \"4b489c8d-aa41-41cf-a984-9479eda75544\" (UID: \"4b489c8d-aa41-41cf-a984-9479eda75544\") " Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.340832 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host" (OuterVolumeSpecName: "host") pod "4b489c8d-aa41-41cf-a984-9479eda75544" (UID: "4b489c8d-aa41-41cf-a984-9479eda75544"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.341892 4720 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b489c8d-aa41-41cf-a984-9479eda75544-host\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.349970 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx" (OuterVolumeSpecName: "kube-api-access-m6pkx") pod "4b489c8d-aa41-41cf-a984-9479eda75544" (UID: "4b489c8d-aa41-41cf-a984-9479eda75544"). InnerVolumeSpecName "kube-api-access-m6pkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:13 crc kubenswrapper[4720]: I0121 15:02:13.443983 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6pkx\" (UniqueName: \"kubernetes.io/projected/4b489c8d-aa41-41cf-a984-9479eda75544-kube-api-access-m6pkx\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152643 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-fblvn" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152646 4720 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51adffb4e5148a9253b0ff230c11ab3a42ce9ad43dec9586d4d824fd37aed830" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.152953 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxdtl" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server" containerID="cri-o://7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" gracePeriod=2 Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.564075 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:14 crc kubenswrapper[4720]: E0121 15:02:14.564775 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.564791 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.565039 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" containerName="container-00" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.565782 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.567905 4720 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ltcrl"/"default-dockercfg-bztpl" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.603011 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.654535 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663451 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663489 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663759 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") pod \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\" (UID: \"4f558038-e16a-4aa1-bb7b-ddb6f14987a7\") " Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.663987 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.664051 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.665109 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities" (OuterVolumeSpecName: "utilities") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.684036 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm" (OuterVolumeSpecName: "kube-api-access-lmktm") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "kube-api-access-lmktm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.698845 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b489c8d-aa41-41cf-a984-9479eda75544" path="/var/lib/kubelet/pods/4b489c8d-aa41-41cf-a984-9479eda75544/volumes" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.700469 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f558038-e16a-4aa1-bb7b-ddb6f14987a7" (UID: "4f558038-e16a-4aa1-bb7b-ddb6f14987a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.723237 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765576 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765696 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.765948 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766781 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766809 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.766823 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmktm\" (UniqueName: \"kubernetes.io/projected/4f558038-e16a-4aa1-bb7b-ddb6f14987a7-kube-api-access-lmktm\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.786791 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"crc-debug-tsfvh\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: I0121 15:02:14.935137 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:14 crc kubenswrapper[4720]: W0121 15:02:14.956322 4720 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf099413_bd8b_4037_89d4_60155f99f19e.slice/crio-2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910 WatchSource:0}: Error finding container 2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910: Status 404 returned error can't find the container with id 2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.027262 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164127 4720 generic.go:334] "Generic (PLEG): container finished" podID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" exitCode=0 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164230 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164302 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxdtl" event={"ID":"4f558038-e16a-4aa1-bb7b-ddb6f14987a7","Type":"ContainerDied","Data":"ec6aaee5e30b27a55c2206c76a5a2e84eaba6a236f2d843ee4cfb96336a189b9"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164335 4720 scope.go:117] "RemoveContainer" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.164542 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxdtl" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166392 4720 generic.go:334] "Generic (PLEG): container finished" podID="df099413-bd8b-4037-89d4-60155f99f19e" containerID="299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37" exitCode=1 Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166580 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" event={"ID":"df099413-bd8b-4037-89d4-60155f99f19e","Type":"ContainerDied","Data":"299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.166672 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" event={"ID":"df099413-bd8b-4037-89d4-60155f99f19e","Type":"ContainerStarted","Data":"2339930d951d09813c333822dd382b293faa6aa1833f73cdffaa2cac3ab11910"} Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.185369 4720 scope.go:117] "RemoveContainer" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.215152 4720 scope.go:117] "RemoveContainer" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.218846 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.229466 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxdtl"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.238494 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244207 4720 scope.go:117] "RemoveContainer" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.244536 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": container with ID starting with 7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df not found: ID does not exist" containerID="7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244565 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df"} err="failed to get container status \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": rpc error: code = NotFound desc = could not find container \"7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df\": container with ID starting with 7bd6838694b592aaa035d463ea22b7f37cf62840e364dfb62000218aab28c0df not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.244584 4720 scope.go:117] "RemoveContainer" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.245050 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": container with ID starting with 
a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed not found: ID does not exist" containerID="a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245075 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed"} err="failed to get container status \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": rpc error: code = NotFound desc = could not find container \"a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed\": container with ID starting with a1023d7679500ef93dc8f6651cd620c7b2b00e6d045eb90b1e889e744e4df3ed not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245089 4720 scope.go:117] "RemoveContainer" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: E0121 15:02:15.245611 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": container with ID starting with fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992 not found: ID does not exist" containerID="fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.245686 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992"} err="failed to get container status \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": rpc error: code = NotFound desc = could not find container \"fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992\": container with ID starting with fdc8c3f9dedb691d525e6b32624691f58b5d6a0081fa1325fe2862476431a992 not found: ID does not exist" Jan 21 15:02:15 crc kubenswrapper[4720]: I0121 15:02:15.247164 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/crc-debug-tsfvh"] Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.174445 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6x2nt" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server" containerID="cri-o://ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" gracePeriod=2 Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.356113 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.396469 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") pod \"df099413-bd8b-4037-89d4-60155f99f19e\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.396560 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") pod \"df099413-bd8b-4037-89d4-60155f99f19e\" (UID: \"df099413-bd8b-4037-89d4-60155f99f19e\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.397684 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host" (OuterVolumeSpecName: "host") pod "df099413-bd8b-4037-89d4-60155f99f19e" (UID: "df099413-bd8b-4037-89d4-60155f99f19e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.419917 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv" (OuterVolumeSpecName: "kube-api-access-9dfnv") pod "df099413-bd8b-4037-89d4-60155f99f19e" (UID: "df099413-bd8b-4037-89d4-60155f99f19e"). InnerVolumeSpecName "kube-api-access-9dfnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.504522 4720 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df099413-bd8b-4037-89d4-60155f99f19e-host\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.504567 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dfnv\" (UniqueName: \"kubernetes.io/projected/df099413-bd8b-4037-89d4-60155f99f19e-kube-api-access-9dfnv\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.660905 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.688812 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" path="/var/lib/kubelet/pods/4f558038-e16a-4aa1-bb7b-ddb6f14987a7/volumes" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.689556 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df099413-bd8b-4037-89d4-60155f99f19e" path="/var/lib/kubelet/pods/df099413-bd8b-4037-89d4-60155f99f19e/volumes" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707112 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707325 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.707383 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") pod \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\" (UID: \"26e526e0-a293-4e24-a0b3-cc7fa0e9308b\") " Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.721411 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk" (OuterVolumeSpecName: "kube-api-access-l5vdk") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "kube-api-access-l5vdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.726339 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities" (OuterVolumeSpecName: "utilities") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.810556 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5vdk\" (UniqueName: \"kubernetes.io/projected/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-kube-api-access-l5vdk\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.810601 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.881198 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26e526e0-a293-4e24-a0b3-cc7fa0e9308b" (UID: "26e526e0-a293-4e24-a0b3-cc7fa0e9308b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:02:16 crc kubenswrapper[4720]: I0121 15:02:16.912945 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26e526e0-a293-4e24-a0b3-cc7fa0e9308b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.184370 4720 scope.go:117] "RemoveContainer" containerID="299c1e617983776558f282014d4b14f16c1ee5a3630b9cf95ccb65cd32d55d37" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.184385 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/crc-debug-tsfvh" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186722 4720 generic.go:334] "Generic (PLEG): container finished" podID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" exitCode=0 Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186772 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6x2nt" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.186771 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.187490 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6x2nt" event={"ID":"26e526e0-a293-4e24-a0b3-cc7fa0e9308b","Type":"ContainerDied","Data":"50874e76ea4107b7c07d6f1ccee98b03d59ab44fcfb2f73925bdd79450642dc9"} Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.219885 4720 scope.go:117] "RemoveContainer" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.232145 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.240469 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6x2nt"] Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.242562 4720 scope.go:117] "RemoveContainer" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.261905 4720 scope.go:117] "RemoveContainer" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.278577 4720 scope.go:117] "RemoveContainer" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.278964 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": container with ID starting with ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851 not found: ID does not exist" containerID="ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279004 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851"} err="failed to 
get container status \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": rpc error: code = NotFound desc = could not find container \"ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851\": container with ID starting with ca3be57ac007ed5872cbc69b80c1e3b190a6a88999004e016e12d63ec17af851 not found: ID does not exist" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279023 4720 scope.go:117] "RemoveContainer" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.279287 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": container with ID starting with 6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d not found: ID does not exist" containerID="6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279355 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d"} err="failed to get container status \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": rpc error: code = NotFound desc = could not find container \"6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d\": container with ID starting with 6bb1fadb296f4f7a4e177b677ea368708667b26d03ffd5e4381a0ce2b42c380d not found: ID does not exist" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279392 4720 scope.go:117] "RemoveContainer" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: E0121 15:02:17.279706 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": container with ID starting with a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7 not found: ID does not exist" containerID="a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7" Jan 21 15:02:17 crc kubenswrapper[4720]: I0121 15:02:17.279755 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7"} err="failed to get container status \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": rpc error: code = NotFound desc = could not find container \"a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7\": container with ID starting with a4664b70456dd4409bc474ee18c3c107d7f7a37fd01804bf2b091e5aafe215e7 not found: ID does not exist" Jan 21 15:02:18 crc kubenswrapper[4720]: I0121 15:02:18.688951 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" path="/var/lib/kubelet/pods/26e526e0-a293-4e24-a0b3-cc7fa0e9308b/volumes" Jan 21 15:02:23 crc kubenswrapper[4720]: I0121 15:02:23.678189 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827" Jan 21 15:02:24 crc kubenswrapper[4720]: I0121 15:02:24.244053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" 
event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"} Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.711417 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.745407 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.755551 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.765741 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.814461 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.827346 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.873630 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.886050 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log" Jan 21 15:02:30 crc kubenswrapper[4720]: I0121 15:02:30.901885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.145036 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.156968 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.215728 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.227821 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.266836 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.307896 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.388708 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.398812 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.421606 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log" Jan 21 15:02:31 crc kubenswrapper[4720]: I0121 15:02:31.582863 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.409542 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.416996 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.465604 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.487860 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.510410 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.518683 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.604872 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.616847 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log" Jan 21 15:02:32 crc kubenswrapper[4720]: I0121 15:02:32.627438 4720 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.429294 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jtj6g_48af697e-308a-4bdd-a5d8-d86cd5c4fb0c/control-plane-machine-set-operator/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.445848 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/kube-rbac-proxy/0.log" Jan 21 15:02:37 crc kubenswrapper[4720]: I0121 15:02:37.451426 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/machine-api-operator/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.093479 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.105695 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log" Jan 21 15:02:43 crc kubenswrapper[4720]: I0121 15:02:43.112420 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.080601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-f9sxz_e3d11ff0-1741-4f0d-aa50-6e0144e843a6/nmstate-console-plugin/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.099333 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l74mh_da16493b-aa03-4556-b3ce-d87ccfdbba70/nmstate-handler/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.111922 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/nmstate-metrics/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.122476 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/kube-rbac-proxy/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.136301 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mclmr_2bdd7be0-b9cf-4501-9816-87831d74becc/nmstate-operator/0.log" Jan 21 15:02:48 crc kubenswrapper[4720]: I0121 15:02:48.146559 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-xcckr_c338dc84-0c3a-44c4-8f08-82001f532c2b/nmstate-webhook/0.log" Jan 21 15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.103346 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log" Jan 21 15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.110725 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log" Jan 21 
15:02:59 crc kubenswrapper[4720]: I0121 15:02:59.135783 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.040801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.056605 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.062613 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.067565 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.079448 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.084004 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.092591 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.100743 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.120418 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.145013 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.154326 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.421319 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log"
Jan 21 15:03:00 crc kubenswrapper[4720]: I0121 15:03:00.434721 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.317803 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/extract/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.327868 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/util/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.342552 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4q7hb_93611686-cfcc-4f9b-985d-a8e0d9cb7219/pull/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.356217 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/extract/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.368638 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/util/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.392967 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7132b5zw_d714bdab-c0dc-4710-bae5-ec08841d2c0d/pull/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.646722 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/registry-server/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.652866 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/extract-utilities/0.log"
Jan 21 15:03:05 crc kubenswrapper[4720]: I0121 15:03:05.670481 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-kb2c7_c9a5b258-9d31-4031-85f0-1c8d00da3dda/extract-content/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.007963 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/registry-server/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.012919 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/extract-utilities/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.018571 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bqrkw_f9a3c893-2903-4355-9af3-b8f981477494/extract-content/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.031598 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s9hd2_fff1b41d-1ec5-4928-8a7c-a1fd545f8e8c/marketplace-operator/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.174142 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/registry-server/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.194300 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/extract-utilities/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.203780 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-7fb4w_1f47a635-f04f-4002-a264-f10be8c70e10/extract-content/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.558957 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/registry-server/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.564575 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/extract-utilities/0.log"
Jan 21 15:03:06 crc kubenswrapper[4720]: I0121 15:03:06.572251 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-4hxc8_86ba467d-dfbe-493b-acf6-17b938a753b0/extract-content/0.log"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.711998 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.712932 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.712951 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.712983 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.712993 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713006 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-utilities"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713016 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-utilities"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713025 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-content"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713032 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="extract-content"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713051 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-content"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713060 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-content"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713075 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-utilities"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713082 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="extract-utilities"
Jan 21 15:03:22 crc kubenswrapper[4720]: E0121 15:03:22.713095 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713103 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713336 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e526e0-a293-4e24-a0b3-cc7fa0e9308b" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713366 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="df099413-bd8b-4037-89d4-60155f99f19e" containerName="container-00"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.713380 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f558038-e16a-4aa1-bb7b-ddb6f14987a7" containerName="registry-server"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.715122 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.725677 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.779524 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.779955 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.780015 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881675 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881753 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.881778 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.882280 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.882496 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:22 crc kubenswrapper[4720]: I0121 15:03:22.903240 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"community-operators-vlw79\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") " pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.037907 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.598693 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:23 crc kubenswrapper[4720]: I0121 15:03:23.736392 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerStarted","Data":"0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be"}
Jan 21 15:03:24 crc kubenswrapper[4720]: I0121 15:03:24.750034 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735" exitCode=0
Jan 21 15:03:24 crc kubenswrapper[4720]: I0121 15:03:24.751444 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735"}
Jan 21 15:03:26 crc kubenswrapper[4720]: I0121 15:03:26.770298 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330" exitCode=0
Jan 21 15:03:26 crc kubenswrapper[4720]: I0121 15:03:26.770338 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330"}
Jan 21 15:03:27 crc kubenswrapper[4720]: I0121 15:03:27.795053 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerStarted","Data":"0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33"}
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.681787 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vlw79" podStartSLOduration=6.014809144 podStartE2EDuration="8.681763889s" podCreationTimestamp="2026-01-21 15:03:22 +0000 UTC" firstStartedPulling="2026-01-21 15:03:24.753083393 +0000 UTC m=+2042.661823325" lastFinishedPulling="2026-01-21 15:03:27.420038128 +0000 UTC m=+2045.328778070" observedRunningTime="2026-01-21 15:03:27.824294195 +0000 UTC m=+2045.733034137" watchObservedRunningTime="2026-01-21 15:03:30.681763889 +0000 UTC m=+2048.590503831"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.690773 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"]
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.693062 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.705643 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"]
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.744773 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.745020 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.745063 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846127 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846192 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846331 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846722 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-utilities\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.846760 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-catalog-content\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:30 crc kubenswrapper[4720]: I0121 15:03:30.877016 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgkkm\" (UniqueName: \"kubernetes.io/projected/2bbd360e-7396-4ec2-bc33-d4c909b4c7e4-kube-api-access-sgkkm\") pod \"certified-operators-bcpwx\" (UID: \"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4\") " pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.013912 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.681836 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"]
Jan 21 15:03:31 crc kubenswrapper[4720]: I0121 15:03:31.826911 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerStarted","Data":"832a33e027291133d6801641b6adbeaae84d163c91a9f269a3243eee008052ee"}
Jan 21 15:03:32 crc kubenswrapper[4720]: I0121 15:03:32.838238 4720 generic.go:334] "Generic (PLEG): container finished" podID="2bbd360e-7396-4ec2-bc33-d4c909b4c7e4" containerID="9f2b6912011fb3da8fb122decfaf75a67e87b0a016def09d17bba5e8af3a0bf9" exitCode=0
Jan 21 15:03:32 crc kubenswrapper[4720]: I0121 15:03:32.838546 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerDied","Data":"9f2b6912011fb3da8fb122decfaf75a67e87b0a016def09d17bba5e8af3a0bf9"}
Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.039482 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.039544 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.093152 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:33 crc kubenswrapper[4720]: I0121 15:03:33.911085 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:37 crc kubenswrapper[4720]: I0121 15:03:37.883480 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:37 crc kubenswrapper[4720]: I0121 15:03:37.884058 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vlw79" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" containerID="cri-o://0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33" gracePeriod=2
Jan 21 15:03:38 crc kubenswrapper[4720]: I0121 15:03:38.894812 4720 generic.go:334] "Generic (PLEG): container finished" podID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerID="0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33" exitCode=0
Jan 21 15:03:38 crc kubenswrapper[4720]: I0121 15:03:38.894889 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33"}
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.914069 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923504 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vlw79" event={"ID":"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4","Type":"ContainerDied","Data":"0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be"}
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923553 4720 scope.go:117] "RemoveContainer" containerID="0d46e2daa19e7deb8759de3621890012a927a9fe20a077c93d91e0b0c39a3b33"
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.923595 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vlw79"
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.971897 4720 scope.go:117] "RemoveContainer" containerID="aec074b3448f19a406451b31f8106a118df1722470ad7209f21e49b82cc05330"
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997436 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") "
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997610 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") "
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.997645 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") pod \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\" (UID: \"bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4\") "
Jan 21 15:03:41 crc kubenswrapper[4720]: I0121 15:03:41.999331 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities" (OuterVolumeSpecName: "utilities") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.015076 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g" (OuterVolumeSpecName: "kube-api-access-pz52g") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "kube-api-access-pz52g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.076882 4720 scope.go:117] "RemoveContainer" containerID="977744b7bf9183763fc0fbfe01571d5a8dfc1595622586d10028ba776c5ca735"
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.099987 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.100030 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz52g\" (UniqueName: \"kubernetes.io/projected/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-kube-api-access-pz52g\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.113583 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" (UID: "bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.202244 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.267027 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.274750 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vlw79"]
Jan 21 15:03:42 crc kubenswrapper[4720]: E0121 15:03:42.296249 4720 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfad437f_5d3f_4e35_88e2_1ee6a3a4b6e4.slice/crio-0d606b6c4d92032c697bb7849cf5732036ee5f28ade04eb215c43e2f1cfdd6be\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bbd360e_7396_4ec2_bc33_d4c909b4c7e4.slice/crio-2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db.scope\": RecentStats: unable to find data in memory cache]"
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.688258 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" path="/var/lib/kubelet/pods/bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4/volumes"
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.936700 4720 generic.go:334] "Generic (PLEG): container finished" podID="2bbd360e-7396-4ec2-bc33-d4c909b4c7e4" containerID="2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db" exitCode=0
Jan 21 15:03:42 crc kubenswrapper[4720]: I0121 15:03:42.936781 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerDied","Data":"2c905405b047deb5d4e5c601c3a7cc9b8fd2ca0129f40a148472618d9dc837db"}
Jan 21 15:03:45 crc kubenswrapper[4720]: I0121 15:03:45.963383 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bcpwx" event={"ID":"2bbd360e-7396-4ec2-bc33-d4c909b4c7e4","Type":"ContainerStarted","Data":"2e0f732f0a5c1ef346767faf4b3f12df05889a1ed2ccb85a7b3529777aa53ac6"}
Jan 21 15:03:45 crc kubenswrapper[4720]: I0121 15:03:45.990284 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bcpwx" podStartSLOduration=3.989257816 podStartE2EDuration="15.990266509s" podCreationTimestamp="2026-01-21 15:03:30 +0000 UTC" firstStartedPulling="2026-01-21 15:03:32.84003234 +0000 UTC m=+2050.748772272" lastFinishedPulling="2026-01-21 15:03:44.841041033 +0000 UTC m=+2062.749780965" observedRunningTime="2026-01-21 15:03:45.98245057 +0000 UTC m=+2063.891190502" watchObservedRunningTime="2026-01-21 15:03:45.990266509 +0000 UTC m=+2063.899006441"
Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.014367 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.014972 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:51 crc kubenswrapper[4720]: I0121 15:03:51.065287 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.068595 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bcpwx"
Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.172713 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bcpwx"]
Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.211998 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"]
Jan 21 15:03:52 crc kubenswrapper[4720]: I0121 15:03:52.212225 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kb2c7" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" containerID="cri-o://e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa" gracePeriod=2
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.031516 4720 generic.go:334] "Generic (PLEG): container finished" podID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerID="e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa" exitCode=0
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.031796 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa"}
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.912815 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7"
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934175 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") "
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934295 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") "
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.934332 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") pod \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\" (UID: \"c9a5b258-9d31-4031-85f0-1c8d00da3dda\") "
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.937076 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities" (OuterVolumeSpecName: "utilities") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:03:53 crc kubenswrapper[4720]: I0121 15:03:53.943596 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg" (OuterVolumeSpecName: "kube-api-access-rfcfg") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "kube-api-access-rfcfg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.034391 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9a5b258-9d31-4031-85f0-1c8d00da3dda" (UID: "c9a5b258-9d31-4031-85f0-1c8d00da3dda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036420 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-utilities\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036498 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcfg\" (UniqueName: \"kubernetes.io/projected/c9a5b258-9d31-4031-85f0-1c8d00da3dda-kube-api-access-rfcfg\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.036568 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9a5b258-9d31-4031-85f0-1c8d00da3dda-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.047540 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kb2c7"
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.048617 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kb2c7" event={"ID":"c9a5b258-9d31-4031-85f0-1c8d00da3dda","Type":"ContainerDied","Data":"802779e03cb8d5a94886d5052622d2f225df3c745fbe9ca0d9b7f323d0685420"}
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.048691 4720 scope.go:117] "RemoveContainer" containerID="e090310451bfa0ea474a10a9ee80aac36797337db2c5a79361cb32bef9c0d9aa"
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.096482 4720 scope.go:117] "RemoveContainer" containerID="c6d0ea8c2e2121a74778b256a77f4b032d4f796cda4fbfab99f77a84e3288124"
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.098386 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"]
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.107975 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kb2c7"]
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.139083 4720 scope.go:117] "RemoveContainer" containerID="43eba3433cb18996557abdfca43416ddb338165d69b1ca200a34d85ce638dbbb"
Jan 21 15:03:54 crc kubenswrapper[4720]: I0121 15:03:54.686874 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" path="/var/lib/kubelet/pods/c9a5b258-9d31-4031-85f0-1c8d00da3dda/volumes"
Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.928594 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/controller/0.log"
Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.938510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-72sfn_51379103-8c08-45c6-a0f3-86928d43bd50/kube-rbac-proxy/0.log"
Jan 21 15:04:33 crc kubenswrapper[4720]: I0121 15:04:33.954690 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/controller/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.085210 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.102535 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.132452 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.934306 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.944418 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/reloader/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.950717 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/frr-metrics/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.959580 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.968335 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/kube-rbac-proxy-frr/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.975559 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-frr-files/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.982678 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-reloader/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.989112 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-ldp4q_bc431866-4baf-47fc-8767-705a11b9bea0/cp-metrics/0.log"
Jan 21 15:04:34 crc kubenswrapper[4720]: I0121 15:04:34.998739 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-lsrs9_8ba45f1e-4559-4408-b129-b061d406fce6/frr-k8s-webhook-server/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.021249 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b8c8cff46-cbv67_b6fdd799-fe82-4cd7-b825-c755b6189180/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.037825 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-75df998c5f-tnbdz_6c334ce5-b6c7-40c8-a261-5a5084ae3db8/webhook-server/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.353719 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/speaker/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.370126 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-m7fv6_49f5ffd3-2bfb-4b94-b2a7-2aa923c730e1/kube-rbac-proxy/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.397923 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.412123 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.424307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.429480 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.476315 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.498547 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.546534 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.556428 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.565241 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.771118 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.782677 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.835203 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.846463 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.877047 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log"
Jan 21 15:04:35 crc kubenswrapper[4720]: I0121 15:04:35.925406 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.006974 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.025496 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.040153 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.183134 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.731495 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-d6jp2_4eec0898-8a1a-47d9-ac37-62cfe6c7b857/cert-manager-controller/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.767884 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-c4tn5_4939bfdd-b3b4-4850-8b5d-3399548ad5a0/cert-manager-cainjector/0.log"
Jan 21 15:04:36 crc kubenswrapper[4720]: I0121 15:04:36.778307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-vflwv_0236eaa4-e5d8-4699-82f8-1e9648f95dc8/cert-manager-webhook/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.027112 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.044951 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.116688 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.154019 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.181006 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.194108 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.242248 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.253353 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.267259 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.820987 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jtj6g_48af697e-308a-4bdd-a5d8-d86cd5c4fb0c/control-plane-machine-set-operator/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.834801 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/kube-rbac-proxy/0.log"
Jan 21 15:04:37 crc kubenswrapper[4720]: I0121 15:04:37.844148 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h9ckd_1a75d5de-a507-41ca-8206-eae702d16020/machine-api-operator/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.031295 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-q2t2m_655f8c6a-4936-45d3-9538-66ee77a050d3/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.040624 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/extract/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.048151 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/util/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.057702 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cb9f7f9c3136cfd55efb8a19dabe48f877688a7289c979c7af167e1c21l7s9g_533f904c-bfa5-42e7-a907-5fe372443d20/pull/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.096087 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-wnzfm_b7ea6739-9c38-44a0-a382-8b26e37138fa/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.107096 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-bjn2r_96218341-1cf7-4aa1-bb9a-7a7abba7a93e/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.155457 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-gwlgm_6c93648a-7076-4d91-ac7a-f389ab1159cc/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.165184 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-bl4z8_9a5569f7-371f-4663-b005-5fdcce36936b/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.177322 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-vfxfh_071d4469-5b09-49a3-97f4-239d811825a2/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.411098 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-f9sxz_e3d11ff0-1741-4f0d-aa50-6e0144e843a6/nmstate-console-plugin/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.412008 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-xtpbn_b80cffaf-5853-47ac-b783-c26da64425ff/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.428391 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-l74mh_da16493b-aa03-4556-b3ce-d87ccfdbba70/nmstate-handler/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.429086 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-glbt4_9b467fa8-1984-4659-8873-99c20204b16b/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.444484 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/nmstate-metrics/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.454272 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-j9dxt_a26c9332-5a74-49a3-8347-45ae67cb1c90/kube-rbac-proxy/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.477250 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-mclmr_2bdd7be0-b9cf-4501-9816-87831d74becc/nmstate-operator/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.486520 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-xcckr_c338dc84-0c3a-44c4-8f08-82001f532c2b/nmstate-webhook/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.489024 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-54hwg_085a2e93-1496-47f3-a7dc-4acae2e201fc/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.499107 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-n5bwd_370e5a87-5edf-4d48-9b65-335400a84cd2/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.537378 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-v4fbm_589a442f-27a6-4d23-85dd-9e5b1556363f/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.580510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-d22bk_c38df2a4-6626-4b71-9dcd-7ef3003ee693/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.657593 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-vzzmp_bb4052fb-6d6d-4e2a-b2f7-ee94c97512b5/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.666814 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-pw4z6_9695fd09-d135-426b-a129-66f945d2dd90/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.688454 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854hhcmw_88e81fdb-6501-410c-9452-d3ba7f41a30d/manager/0.log"
Jan 21 15:04:39 crc kubenswrapper[4720]: I0121 15:04:39.831510 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-68fc899677-pbmmn_d3800217-b53a-4788-a9d4-8861cfdb68a1/operator/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.692053 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-d47656bc9-4hjmr_eb81b686-832a-414b-aa66-cf40a72a7427/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.703609 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-j4xn9_5d59157d-f538-4cb0-959d-11584d7678c5/registry-server/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.755903 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-689zh_88327b24-ce00-4bb4-98d1-24060c6dbf28/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.784735 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-2clln_18ce7f0d-00de-4a92-97f2-743d9057abff/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.806776 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mm7cg_8db4bced-5679-43ab-a5c9-ba87574aaa02/operator/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.824968 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-4tjlt_a2557af5-c155-4d37-9b9a-f9335cac47b1/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.896104 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-8hrkh_a050e31c-3d6d-490c-8f74-637c37c96a5e/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.904686 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7cd8bc9dbb-xczlv_cd17e86c-5586-4ea9-979d-2c195494fe99/manager/0.log"
Jan 21 15:04:40 crc kubenswrapper[4720]: I0121 15:04:40.914457 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-jfkfq_de2e9655-961c-4250-9852-332dfe335b4a/manager/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.034344 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/kube-multus-additional-cni-plugins/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.043589 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/egress-router-binary-copy/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.053307 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/cni-plugins/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.060495 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/bond-cni-plugin/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.067567 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/routeoverride-cni/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.077898 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/whereabouts-cni-bincopy/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.086423 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5r9wf_14cdc412-e60b-4b9b-b37d-33b1f061f44d/whereabouts-cni/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.113599 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-7mfnf_92d3c944-8def-4f95-a3cb-781f929f5f28/multus-admission-controller/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.119101 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-7mfnf_92d3c944-8def-4f95-a3cb-781f929f5f28/kube-rbac-proxy/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.175244 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.212601 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85dm_a40805c6-ef8a-4ae0-bb5b-1834d257e8c6/kube-multus/1.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.241714 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x48m6_139c8416-e015-49e4-adfe-32f9e142621f/network-metrics-daemon/0.log"
Jan 21 15:04:43 crc kubenswrapper[4720]: I0121 15:04:43.246813 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-x48m6_139c8416-e015-49e4-adfe-32f9e142621f/kube-rbac-proxy/0.log"
Jan 21 15:04:52 crc kubenswrapper[4720]: I0121 15:04:52.880443 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:04:52 crc kubenswrapper[4720]: I0121 15:04:52.880967 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:05:22 crc kubenswrapper[4720]: I0121 15:05:22.880304 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:05:22 crc kubenswrapper[4720]: I0121 15:05:22.881724 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.879543 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.879982 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880017 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk"
Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880535 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:05:52 crc kubenswrapper[4720]: I0121 15:05:52.880575 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7" gracePeriod=600
Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.036699 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7" exitCode=0
Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.036779 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"}
Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.037261 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerStarted","Data":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"}
Jan 21 15:05:54 crc kubenswrapper[4720]: I0121 15:05:54.037286 4720 scope.go:117] "RemoveContainer" containerID="c475a1e377102b3d19ed0ef4780ea7e71bffdcba82be28e30d50d8f856aa2827"
Jan 21 15:08:00 crc kubenswrapper[4720]: I0121 15:08:00.715053 4720 scope.go:117] "RemoveContainer" containerID="a41cd8196de8dca42371cf925db6d045de3d0cbd2f7f8353d4af3ee985a4735d"
Jan 21 15:08:22 crc kubenswrapper[4720]: I0121 15:08:22.880289 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:08:22 crc kubenswrapper[4720]: I0121 15:08:22.880937 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:08:52 crc kubenswrapper[4720]: I0121 15:08:52.880551 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:08:52 crc kubenswrapper[4720]: I0121 15:08:52.881131 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.879915 4720 patch_prober.go:28] interesting pod/machine-config-daemon-2pbsk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.880818 4720 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.880912 4720 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk"
Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.882290 4720 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"} pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 21 15:09:22 crc kubenswrapper[4720]: I0121 15:09:22.882899 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerName="machine-config-daemon" containerID="cri-o://9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" gracePeriod=600
Jan 21 15:09:23 crc kubenswrapper[4720]: E0121 15:09:23.016597 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996"
Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837718 4720 generic.go:334] "Generic (PLEG): container finished" podID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" exitCode=0
Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837753 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" event={"ID":"c1128ddd-06c2-4255-aa17-b62aa0f8a996","Type":"ContainerDied","Data":"9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"}
Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.837799 4720 scope.go:117] "RemoveContainer" containerID="3bee635cc2c3c335bc129c259a16f2476ca04810986abca3de29789dac0840b7"
Jan 21 15:09:23 crc kubenswrapper[4720]: I0121 15:09:23.838449 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0"
Jan 21 15:09:23 crc kubenswrapper[4720]: E0121
15:09:23.838758 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:09:36 crc kubenswrapper[4720]: I0121 15:09:36.678711 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:09:36 crc kubenswrapper[4720]: E0121 15:09:36.679968 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:09:47 crc kubenswrapper[4720]: I0121 15:09:47.677991 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:09:47 crc kubenswrapper[4720]: E0121 15:09:47.678680 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:00 crc kubenswrapper[4720]: I0121 15:10:00.678009 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:00 crc kubenswrapper[4720]: E0121 15:10:00.678768 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:13 crc kubenswrapper[4720]: I0121 15:10:13.678412 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:13 crc kubenswrapper[4720]: E0121 15:10:13.679162 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:24 crc kubenswrapper[4720]: I0121 15:10:24.678816 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:24 crc kubenswrapper[4720]: E0121 15:10:24.679625 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:39 crc kubenswrapper[4720]: I0121 15:10:39.678100 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:39 crc kubenswrapper[4720]: E0121 15:10:39.678969 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:10:52 crc kubenswrapper[4720]: I0121 15:10:52.691236 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:10:52 crc kubenswrapper[4720]: E0121 15:10:52.692118 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:06 crc kubenswrapper[4720]: I0121 15:11:06.678223 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:06 crc kubenswrapper[4720]: E0121 15:11:06.679781 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:20 crc kubenswrapper[4720]: I0121 15:11:20.681468 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:20 crc kubenswrapper[4720]: E0121 15:11:20.682370 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.257008 4720 generic.go:334] "Generic (PLEG): container finished" podID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" exitCode=0 Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.257199 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" event={"ID":"32ba91fa-9395-4dae-8bf6-384541b2d3ed","Type":"ContainerDied","Data":"1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77"} Jan 21 15:11:23 crc 
kubenswrapper[4720]: I0121 15:11:23.257758 4720 scope.go:117] "RemoveContainer" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" Jan 21 15:11:23 crc kubenswrapper[4720]: I0121 15:11:23.589885 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/gather/0.log" Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.985930 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.986652 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" containerID="cri-o://c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" gracePeriod=2 Jan 21 15:11:31 crc kubenswrapper[4720]: I0121 15:11:31.993106 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ltcrl/must-gather-kz6gt"] Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.342003 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.342953 4720 generic.go:334] "Generic (PLEG): container finished" podID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerID="c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" exitCode=143 Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.961891 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:32 crc kubenswrapper[4720]: I0121 15:11:32.962560 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.035023 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") pod \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.036326 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") pod \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\" (UID: \"32ba91fa-9395-4dae-8bf6-384541b2d3ed\") " Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.042731 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx" (OuterVolumeSpecName: "kube-api-access-mvtcx") pod "32ba91fa-9395-4dae-8bf6-384541b2d3ed" (UID: "32ba91fa-9395-4dae-8bf6-384541b2d3ed"). InnerVolumeSpecName "kube-api-access-mvtcx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.138547 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvtcx\" (UniqueName: \"kubernetes.io/projected/32ba91fa-9395-4dae-8bf6-384541b2d3ed-kube-api-access-mvtcx\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.238850 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "32ba91fa-9395-4dae-8bf6-384541b2d3ed" (UID: "32ba91fa-9395-4dae-8bf6-384541b2d3ed"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.240233 4720 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/32ba91fa-9395-4dae-8bf6-384541b2d3ed-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.362641 4720 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ltcrl_must-gather-kz6gt_32ba91fa-9395-4dae-8bf6-384541b2d3ed/copy/0.log" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.362981 4720 scope.go:117] "RemoveContainer" containerID="c33cd97c026b015df83cc6f96e3b1b70f009b429e334e4945f3e1e3052d31932" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.363122 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltcrl/must-gather-kz6gt" Jan 21 15:11:33 crc kubenswrapper[4720]: I0121 15:11:33.385790 4720 scope.go:117] "RemoveContainer" containerID="1c87134fcc0ac7700d916ae3b483f047e49613bd0b9fd19a14ad4f58b8e5db77" Jan 21 15:11:34 crc kubenswrapper[4720]: I0121 15:11:34.679155 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:34 crc kubenswrapper[4720]: E0121 15:11:34.679842 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:11:34 crc kubenswrapper[4720]: I0121 15:11:34.687923 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" path="/var/lib/kubelet/pods/32ba91fa-9395-4dae-8bf6-384541b2d3ed/volumes" Jan 21 15:11:49 crc kubenswrapper[4720]: I0121 15:11:49.678127 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:11:49 crc kubenswrapper[4720]: E0121 15:11:49.678825 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:03 crc kubenswrapper[4720]: I0121 15:12:03.678144 4720 scope.go:117] "RemoveContainer" 
containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:03 crc kubenswrapper[4720]: E0121 15:12:03.678900 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:18 crc kubenswrapper[4720]: I0121 15:12:18.678516 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:18 crc kubenswrapper[4720]: E0121 15:12:18.679406 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398021 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398892 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398903 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398916 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398922 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398932 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398938 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-utilities" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398948 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398953 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398965 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398971 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.398989 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" 
containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.398997 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.399010 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399016 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="extract-content" Jan 21 15:12:30 crc kubenswrapper[4720]: E0121 15:12:30.399027 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399034 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399175 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a5b258-9d31-4031-85f0-1c8d00da3dda" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399186 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="gather" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399208 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ba91fa-9395-4dae-8bf6-384541b2d3ed" containerName="copy" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.399216 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfad437f-5d3f-4e35-88e2-1ee6a3a4b6e4" containerName="registry-server" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.400488 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.406380 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437417 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437490 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.437543 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540079 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540148 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540211 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540783 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.540829 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.559507 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"redhat-operators-bpd4p\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:30 crc kubenswrapper[4720]: I0121 15:12:30.729471 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.196061 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.678013 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:31 crc kubenswrapper[4720]: E0121 15:12:31.678622 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868364 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" exitCode=0 Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868410 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"} Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.868440 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"74fdee2e02ce3e51cc54b0243c921199f45d69bbbd52626ce7e362c84a2ea09a"} Jan 21 15:12:31 crc kubenswrapper[4720]: I0121 15:12:31.870885 4720 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 21 15:12:32 crc kubenswrapper[4720]: I0121 15:12:32.877520 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} Jan 21 15:12:36 crc kubenswrapper[4720]: I0121 15:12:36.924112 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a" exitCode=0 Jan 21 15:12:36 crc kubenswrapper[4720]: I0121 15:12:36.924211 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} Jan 21 15:12:38 crc kubenswrapper[4720]: I0121 15:12:38.941403 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" 
event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerStarted","Data":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} Jan 21 15:12:38 crc kubenswrapper[4720]: I0121 15:12:38.966724 4720 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bpd4p" podStartSLOduration=2.598890247 podStartE2EDuration="8.966643425s" podCreationTimestamp="2026-01-21 15:12:30 +0000 UTC" firstStartedPulling="2026-01-21 15:12:31.870579469 +0000 UTC m=+2589.779319401" lastFinishedPulling="2026-01-21 15:12:38.238332647 +0000 UTC m=+2596.147072579" observedRunningTime="2026-01-21 15:12:38.964046425 +0000 UTC m=+2596.872786397" watchObservedRunningTime="2026-01-21 15:12:38.966643425 +0000 UTC m=+2596.875383357" Jan 21 15:12:40 crc kubenswrapper[4720]: I0121 15:12:40.730360 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:40 crc kubenswrapper[4720]: I0121 15:12:40.730750 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:41 crc kubenswrapper[4720]: I0121 15:12:41.776052 4720 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bpd4p" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" probeResult="failure" output=< Jan 21 15:12:41 crc kubenswrapper[4720]: timeout: failed to connect service ":50051" within 1s Jan 21 15:12:41 crc kubenswrapper[4720]: > Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.679077 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:45 crc kubenswrapper[4720]: E0121 15:12:45.680146 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.917368 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.920293 4720 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:45 crc kubenswrapper[4720]: I0121 15:12:45.929243 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024072 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024144 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.024181 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.126157 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.126876 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127046 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127349 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.127409 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.145995 4720 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"redhat-marketplace-jxjff\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.254005 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:46 crc kubenswrapper[4720]: I0121 15:12:46.790773 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:47 crc kubenswrapper[4720]: I0121 15:12:47.003758 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"} Jan 21 15:12:47 crc kubenswrapper[4720]: I0121 15:12:47.004319 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"1c76c8ddc8c4ad7acb88ec4e3f6aae03bea3b9dd7c84d7a562ccaa3d54ee501e"} Jan 21 15:12:48 crc kubenswrapper[4720]: I0121 15:12:48.010833 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" exitCode=0 Jan 21 15:12:48 crc kubenswrapper[4720]: I0121 15:12:48.010885 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"} Jan 21 15:12:49 crc kubenswrapper[4720]: I0121 15:12:49.024352 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"} Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.033112 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" exitCode=0 Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.033157 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"} Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.790853 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:50 crc kubenswrapper[4720]: I0121 15:12:50.834102 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:51 crc kubenswrapper[4720]: I0121 15:12:51.045535 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerStarted","Data":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"} Jan 21 15:12:51 crc kubenswrapper[4720]: I0121 15:12:51.067613 4720 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxjff" podStartSLOduration=3.621321766 podStartE2EDuration="6.067597139s" podCreationTimestamp="2026-01-21 15:12:45 +0000 UTC" firstStartedPulling="2026-01-21 15:12:48.014691074 +0000 UTC m=+2605.923431006" lastFinishedPulling="2026-01-21 15:12:50.460966457 +0000 UTC m=+2608.369706379" observedRunningTime="2026-01-21 15:12:51.064133977 +0000 UTC m=+2608.972873919" watchObservedRunningTime="2026-01-21 15:12:51.067597139 +0000 UTC m=+2608.976337071" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.083423 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.084029 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bpd4p" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" containerID="cri-o://fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" gracePeriod=2 Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.537604 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662324 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662426 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.662508 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") pod \"fc75242e-0455-42d2-9539-105eceed64f5\" (UID: \"fc75242e-0455-42d2-9539-105eceed64f5\") " Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.663589 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities" (OuterVolumeSpecName: "utilities") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.669855 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52" (OuterVolumeSpecName: "kube-api-access-fms52") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "kube-api-access-fms52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.764963 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.765218 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fms52\" (UniqueName: \"kubernetes.io/projected/fc75242e-0455-42d2-9539-105eceed64f5-kube-api-access-fms52\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.812070 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc75242e-0455-42d2-9539-105eceed64f5" (UID: "fc75242e-0455-42d2-9539-105eceed64f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:12:53 crc kubenswrapper[4720]: I0121 15:12:53.867879 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc75242e-0455-42d2-9539-105eceed64f5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.072966 4720 generic.go:334] "Generic (PLEG): container finished" podID="fc75242e-0455-42d2-9539-105eceed64f5" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" exitCode=0 Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073017 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073048 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bpd4p" event={"ID":"fc75242e-0455-42d2-9539-105eceed64f5","Type":"ContainerDied","Data":"74fdee2e02ce3e51cc54b0243c921199f45d69bbbd52626ce7e362c84a2ea09a"} Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073068 4720 scope.go:117] "RemoveContainer" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.073440 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bpd4p" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.094461 4720 scope.go:117] "RemoveContainer" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.116135 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.126827 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bpd4p"] Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.131762 4720 scope.go:117] "RemoveContainer" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.153609 4720 scope.go:117] "RemoveContainer" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.154137 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": container with ID starting with fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3 not found: ID does not exist" containerID="fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154173 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3"} err="failed to get container status \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": rpc error: code = NotFound desc = could not find container \"fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3\": container with ID starting with fd117cdc0f437e9efd74b15021035f7c75e9817bd437b54fb1d3b690a71f35a3 not found: ID does not exist" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154197 4720 scope.go:117] "RemoveContainer" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a" Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.154623 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": container with ID starting with 5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a not found: ID does not exist" containerID="5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154695 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a"} err="failed to get container status \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": rpc error: code = NotFound desc = could not find container \"5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a\": container with ID starting with 5d5dc767525f0e62f24c9c633361fd142da3afae162ff46fa8725d2777ad4c6a not found: ID does not exist" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.154723 4720 scope.go:117] "RemoveContainer" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" Jan 21 15:12:54 crc kubenswrapper[4720]: E0121 15:12:54.155027 4720 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": container with ID starting with 9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24 not found: ID does not exist" containerID="9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.155049 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24"} err="failed to get container status \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": rpc error: code = NotFound desc = could not find container \"9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24\": container with ID starting with 9ab58b08a3f47b6efb09401b865aec9a08dedba1996df54f7d00b8614a77ca24 not found: ID does not exist" Jan 21 15:12:54 crc kubenswrapper[4720]: I0121 15:12:54.690843 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc75242e-0455-42d2-9539-105eceed64f5" path="/var/lib/kubelet/pods/fc75242e-0455-42d2-9539-105eceed64f5/volumes" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.255034 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.255089 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:56 crc kubenswrapper[4720]: I0121 15:12:56.317050 4720 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:57 crc kubenswrapper[4720]: I0121 15:12:57.166405 4720 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:12:57 crc kubenswrapper[4720]: I0121 15:12:57.483051 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:12:58 crc kubenswrapper[4720]: I0121 15:12:58.679178 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:12:58 crc kubenswrapper[4720]: E0121 15:12:58.679979 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:12:59 crc kubenswrapper[4720]: I0121 15:12:59.115488 4720 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxjff" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" containerID="cri-o://ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" gracePeriod=2 Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.048820 4720 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124192 4720 generic.go:334] "Generic (PLEG): container finished" podID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" exitCode=0 Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124232 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"} Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124256 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxjff" event={"ID":"d48eb729-f085-415e-a0e1-8cf0be5b9547","Type":"ContainerDied","Data":"1c76c8ddc8c4ad7acb88ec4e3f6aae03bea3b9dd7c84d7a562ccaa3d54ee501e"} Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124271 4720 scope.go:117] "RemoveContainer" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.124385 4720 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxjff" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.141814 4720 scope.go:117] "RemoveContainer" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.164061 4720 scope.go:117] "RemoveContainer" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.190994 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.191113 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.191167 4720 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") pod \"d48eb729-f085-415e-a0e1-8cf0be5b9547\" (UID: \"d48eb729-f085-415e-a0e1-8cf0be5b9547\") " Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.195798 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities" (OuterVolumeSpecName: "utilities") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.197885 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp" (OuterVolumeSpecName: "kube-api-access-228jp") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "kube-api-access-228jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.214794 4720 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d48eb729-f085-415e-a0e1-8cf0be5b9547" (UID: "d48eb729-f085-415e-a0e1-8cf0be5b9547"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252323 4720 scope.go:117] "RemoveContainer" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.252859 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": container with ID starting with ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02 not found: ID does not exist" containerID="ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252894 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02"} err="failed to get container status \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": rpc error: code = NotFound desc = could not find container \"ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02\": container with ID starting with ca9d32d2f94af5a41983610267d69b9049bbe2c64f0accc233d75d883ffafa02 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.252934 4720 scope.go:117] "RemoveContainer" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.253288 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": container with ID starting with 2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13 not found: ID does not exist" containerID="2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253325 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13"} err="failed to get container status \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": rpc error: code = NotFound desc = could not find container \"2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13\": container with ID starting with 2f366cae8a61374e6f528fec14f6e9a02495f3b65e98eb925e5ccef151cf6a13 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253340 4720 scope.go:117] "RemoveContainer" 
containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" Jan 21 15:13:00 crc kubenswrapper[4720]: E0121 15:13:00.253802 4720 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": container with ID starting with 200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91 not found: ID does not exist" containerID="200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.253825 4720 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91"} err="failed to get container status \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": rpc error: code = NotFound desc = could not find container \"200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91\": container with ID starting with 200fe4da97ffd136db4042df204fa1e2c0c459c3f8c3ad527f2845119f519a91 not found: ID does not exist" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294509 4720 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-228jp\" (UniqueName: \"kubernetes.io/projected/d48eb729-f085-415e-a0e1-8cf0be5b9547-kube-api-access-228jp\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294551 4720 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-utilities\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.294561 4720 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d48eb729-f085-415e-a0e1-8cf0be5b9547-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.457268 4720 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.463961 4720 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxjff"] Jan 21 15:13:00 crc kubenswrapper[4720]: I0121 15:13:00.689817 4720 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" path="/var/lib/kubelet/pods/d48eb729-f085-415e-a0e1-8cf0be5b9547/volumes" Jan 21 15:13:12 crc kubenswrapper[4720]: I0121 15:13:12.686413 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:12 crc kubenswrapper[4720]: E0121 15:13:12.687083 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:25 crc kubenswrapper[4720]: I0121 15:13:25.678070 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:25 crc kubenswrapper[4720]: E0121 15:13:25.678678 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:37 crc kubenswrapper[4720]: I0121 15:13:37.678971 4720 scope.go:117] "RemoveContainer" containerID="9d1eb32e86a82cb45054a8a1856f9cf6ce1c96c2b6a24080e4647d12db5165f0" Jan 21 15:13:37 crc kubenswrapper[4720]: E0121 15:13:37.679914 4720 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2pbsk_openshift-machine-config-operator(c1128ddd-06c2-4255-aa17-b62aa0f8a996)\"" pod="openshift-machine-config-operator/machine-config-daemon-2pbsk" podUID="c1128ddd-06c2-4255-aa17-b62aa0f8a996" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.805959 4720 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806467 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806486 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806509 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806520 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806532 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806543 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-utilities" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806573 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806585 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806605 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806616 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="extract-content" Jan 21 15:13:39 crc kubenswrapper[4720]: E0121 15:13:39.806635 4720 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806647 4720 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc75242e-0455-42d2-9539-105eceed64f5" 
containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.806998 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc75242e-0455-42d2-9539-105eceed64f5" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.807023 4720 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48eb729-f085-415e-a0e1-8cf0be5b9547" containerName="registry-server" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.808960 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.817774 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.920452 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.920920 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:39 crc kubenswrapper[4720]: I0121 15:13:39.921045 4720 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.022983 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.023701 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.024087 4720 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.024428 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-catalog-content\") pod \"certified-operators-vnjjz\" (UID: 
\"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.023621 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-utilities\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.058279 4720 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt7k5\" (UniqueName: \"kubernetes.io/projected/c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3-kube-api-access-xt7k5\") pod \"certified-operators-vnjjz\" (UID: \"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3\") " pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.140900 4720 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vnjjz" Jan 21 15:13:40 crc kubenswrapper[4720]: I0121 15:13:40.666541 4720 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vnjjz"] Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449512 4720 generic.go:334] "Generic (PLEG): container finished" podID="c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3" containerID="624cbc59330c55d150276766c1760e430d48c8e6cfc581498667d6c1bad0d164" exitCode=0 Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449568 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnjjz" event={"ID":"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3","Type":"ContainerDied","Data":"624cbc59330c55d150276766c1760e430d48c8e6cfc581498667d6c1bad0d164"} Jan 21 15:13:41 crc kubenswrapper[4720]: I0121 15:13:41.449764 4720 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vnjjz" event={"ID":"c6c3859e-bb69-4c03-b4c2-1a751e0bbdf3","Type":"ContainerStarted","Data":"fb5b52af19746d5479416778b0e6bc369f411deab7a84f12ba69f50850528450"}